Understanding Orthogonal Polynomials for Least Squares Approximations

MAT 4725 Numerical Analysis

Dive into the concept of orthogonal polynomials and their role in least squares approximations. Explore how to apply least squares regression to fit data points and functions, with a focus on minimizing errors for accurate approximations.

  • Polynomials
  • Least Squares
  • Numerical Analysis
  • Regression
  • Approximation




Presentation Transcript


  1. MAT 4725 Numerical Analysis Section 8.2 Orthogonal Polynomials and Least Squares Approximations Part I http://myhome.spu.edu/lauw

  2. We have looked at local approximation. What about approximating a function over an entire interval?

  3. Review Least Squares Approximation (Least Squares Regression)

  4. Linear Least Squares Approx. Given data points $(x_i, y_i)$, $i = 1, 2, \dots, m$, we want to fit a straight line to the data.

  5. Linear Least Squares Approx. Fit the line $y = a_0 + a_1 x$ to the data points $(x_i, y_i)$.

  6. Normal Equations
     $$a_0 m + a_1 \sum_{i=1}^{m} x_i = \sum_{i=1}^{m} y_i$$
     $$a_0 \sum_{i=1}^{m} x_i + a_1 \sum_{i=1}^{m} x_i^2 = \sum_{i=1}^{m} x_i y_i$$
     Solve for $a_0$ and $a_1$.
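As a quick sketch (not part of the original slides), this $2 \times 2$ system can be assembled and solved with NumPy; the data values here are made up for illustration:

```python
import numpy as np

# Hypothetical data points (x_i, y_i), i = 1, ..., m (made up for illustration)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.3, 3.5, 4.2, 5.0, 7.0])
m = len(x)

# Normal equations for the line y = a0 + a1*x:
#   a0*m        + a1*sum(x_i)   = sum(y_i)
#   a0*sum(x_i) + a1*sum(x_i^2) = sum(x_i*y_i)
A = np.array([[m,       x.sum()],
              [x.sum(), (x**2).sum()]])
b = np.array([y.sum(), (x * y).sum()])

a0, a1 = np.linalg.solve(A, b)
```

For these particular points the system works out to $a_0 = 0.33$, $a_1 = 1.29$.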

  7. General Case Given data points $(x_i, y_i)$, $i = 1, 2, \dots, m$, we want to approximate the data with
     $$P_n(x) = \sum_{k=0}^{n} a_k x^k = a_0 + a_1 x + \dots + a_n x^n.$$
     Find $a_k$ such that
     $$E = \sum_{i=1}^{m} \left( y_i - P_n(x_i) \right)^2$$
     is minimized.
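The general case can be sketched the same way: with the Vandermonde matrix $V_{ik} = x_i^k$, the normal equations take the matrix form $(V^T V)\,a = V^T y$. A minimal illustration, assuming NumPy and made-up data:

```python
import numpy as np

# Hypothetical data points (made up for illustration)
x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y = np.array([1.0, 1.28, 1.65, 2.12, 2.72])
n = 2  # degree of P_n

# Vandermonde matrix V[i, k] = x_i^k, so the model is y ~ V @ a
V = np.vander(x, n + 1, increasing=True)

# Normal equations in matrix form: (V^T V) a = V^T y
a = np.linalg.solve(V.T @ V, V.T @ y)

# The minimized error E = sum_i (y_i - P_n(x_i))^2
E = np.sum((y - V @ a) ** 2)
```

The coefficients agree with `np.polyfit(x, y, n)` (which returns them in the opposite order).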

  8. Least Squares Approximation Apply the least squares approximation to functions (instead of data sets). Think of a function as a data set in which each point on the graph is one datum.

  9. Least Squares Approximation of Functions Given $f \in C[a, b]$, approximate $f(x)$ on $[a, b]$ by
     $$P_n(x) = \sum_{k=0}^{n} a_k x^k.$$

  10. Least Squares Approximation of Functions Find $a_k$ such that
      $$E = \int_a^b \left( f(x) - P_n(x) \right)^2 \, dx$$
      is minimized, where $P_n(x) = \sum_{k=0}^{n} a_k x^k$.

  11. Normal Equations
      $$\sum_{k=0}^{n} a_k \int_a^b x^{j+k} \, dx = \int_a^b x^j f(x) \, dx, \quad j = 0, 1, \dots, n.$$
      Solve for $a_k$.
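One way to set up and solve these continuous normal equations numerically (a sketch, assuming NumPy; the helper name is made up): the left-hand integrals have the closed form $\int_a^b x^{j+k}\,dx = (b^{j+k+1} - a^{j+k+1})/(j+k+1)$, while the right-hand integrals are approximated with trapezoid weights on a fine grid.

```python
import numpy as np

def continuous_ls_fit(f, a, b, n, num=10001):
    """Degree-n least squares polynomial coefficients a_0, ..., a_n for f on [a, b]."""
    # Left-hand side: exact monomial integrals
    # int_a^b x^(j+k) dx = (b^(j+k+1) - a^(j+k+1)) / (j + k + 1)
    G = np.array([[(b**(j + k + 1) - a**(j + k + 1)) / (j + k + 1)
                   for k in range(n + 1)] for j in range(n + 1)])

    # Right-hand side: int_a^b x^j f(x) dx via the composite trapezoid rule
    t = np.linspace(a, b, num)
    w = np.full(num, (b - a) / (num - 1))
    w[0] /= 2.0
    w[-1] /= 2.0
    rhs = np.array([w @ (t**j * f(t)) for j in range(n + 1)])

    return np.linalg.solve(G, rhs)
```

For example, `continuous_ls_fit(np.sin, 0.0, np.pi, 3)` fits a cubic to $\sin x$ on $[0, \pi]$. As a sanity check, fitting a polynomial of degree $n$ to a degree-$n$ polynomial recovers its own coefficients.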

  12. Example 1 Find the least squares approximation of $f(x) = \sin(\pi x)$ on $[0, 1]$ by $P_2(x)$, using the normal equations
      $$\sum_{k=0}^{n} a_k \int_a^b x^{j+k} \, dx = \int_a^b x^j f(x) \, dx, \quad j = 0, 1, \dots, n.$$

  13. Example 1 [Plot of $f(x) = \sin(\pi x)$ and its least squares approximation $P_2(x)$ on $[0, 1]$.]
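Example 1 can be checked numerically; the sketch below (assuming NumPy, not part of the original slides) uses the right-hand-side integrals worked out by integration by parts:

```python
import math
import numpy as np

# Example 1: least squares fit of f(x) = sin(pi*x) on [0, 1] by P_2(x).
# Left-hand side: int_0^1 x^(j+k) dx = 1/(j + k + 1), a 3x3 Hilbert matrix.
H = np.array([[1.0 / (j + k + 1) for k in range(3)] for j in range(3)])

# Right-hand sides, by integration by parts:
#   int_0^1 sin(pi*x) dx       = 2/pi
#   int_0^1 x sin(pi*x) dx     = 1/pi
#   int_0^1 x^2 sin(pi*x) dx   = (pi^2 - 4)/pi^3
rhs = np.array([2 / math.pi, 1 / math.pi, (math.pi**2 - 4) / math.pi**3])

a = np.linalg.solve(H, rhs)  # coefficients a0, a1, a2 of P_2
print(a)  # approximately [-0.0505, 4.1225, -4.1225]
```

Note the symmetry $a_1 = -a_2$, as expected since $\sin(\pi x)$ is symmetric about $x = 1/2$.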

  14. Disadvantages Solving the $(n+1) \times (n+1)$ linear system is expensive, and computing the approximation $P_n(x)$ does not help the computation of $P_{n+1}(x)$.

  15. Homework Download the homework. Read the definitions again. I do not have a solution in SageMath; I am counting on using your program for the future solution key!

  16. Classwork Finish at least the first problem of your HW.
