
Advanced Approximation Techniques for Functions
Explore advanced approximation methods such as least square approximation, orthogonal basis functions, norm, and seminorm concepts, inner product properties, basis functions linear independence, orthogonal functions, and the least square problem in the context of continuous functions. Understand the principles behind these techniques and their applications in mathematical modeling.
Presentation Transcript
Approximation of Functions
Approximation problems are divided into five parts:
1. Least square approximation of discrete functions, or regression
2. Least square approximation of continuous functions using various basis polynomials
3. Orthogonal basis functions
4. Approximation of periodic functions
5. Interpolation
Norm and Seminorm
A real-valued function $\|\cdot\|$ is called a norm on a vector space if it is defined everywhere on the space and satisfies the following conditions:
1. $\|f\| > 0$ for $f \neq 0$; $\|f\| = 0$ iff $f = 0$
2. $\|\lambda f\| = |\lambda|\,\|f\|$ for real $\lambda$
3. $\|f + g\| \le \|f\| + \|g\|$ (triangle inequality)
Lp-norm: $\|f\|_p = \left( \int_a^b |f(x)|^p \, dx \right)^{1/p}$
p = 2 gives the Euclidean or L2 norm: $\|f\|_2 = \left( \int_a^b f(x)^2 \, dx \right)^{1/2}$
p → ∞ gives the maximum norm: $\|f\|_\infty = \max_{a \le x \le b} |f(x)|$
Weighted Euclidean norm: $\|f\|_{2,w} = \left( \int_a^b w(x) f(x)^2 \, dx \right)^{1/2}$
Euclidean seminorm (discrete): $\|f\|_{2,w} = \left( \sum_{i=0}^{m} w_i f(x_i)^2 \right)^{1/2}$
Inner Product
The inner product of two real-valued continuous functions f(x) and g(x) is denoted by $\langle f, g \rangle$ and is defined as:
$\langle f, g \rangle = \int_a^b w(x) f(x) g(x) \, dx$ (continuous case)
$\langle f, g \rangle = \sum_{i=0}^{m} w_i f(x_i) g(x_i)$ (discrete case)
Properties of the inner product:
Commutativity: $\langle f, g \rangle = \langle g, f \rangle$
Linearity: $\langle c_1 f + c_2 g, h \rangle = c_1 \langle f, h \rangle + c_2 \langle g, h \rangle$
Positivity: $\langle f, f \rangle \ge 0$
Relation to the norm (with $w(x) = 1$): $\|f\|_2 = \left[ \int_a^b f(x)^2 \, dx \right]^{1/2}$, so $\langle f, f \rangle = \|f\|_2^2$
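The two definitions above can be checked numerically. Below is a minimal Python sketch (the function names and the choice of trapezoidal quadrature are mine, not from the lecture):

```python
import math

def inner_product(f, g, a, b, w=lambda x: 1.0, n=1000):
    """Continuous inner product <f, g> = integral of w(x) f(x) g(x) over [a, b],
    approximated by the composite trapezoidal rule on n panels."""
    h = (b - a) / n
    total = 0.5 * (w(a) * f(a) * g(a) + w(b) * f(b) * g(b))
    for k in range(1, n):
        x = a + k * h
        total += w(x) * f(x) * g(x)
    return h * total

def inner_product_discrete(f_vals, g_vals, weights=None):
    """Discrete inner product <f, g> = sum_i w_i f(x_i) g(x_i)."""
    if weights is None:
        weights = [1.0] * len(f_vals)
    return sum(w * fv * gv for w, fv, gv in zip(weights, f_vals, g_vals))

# <1, x> on [0, 1] should be 1/2, and ||x||_2 = sqrt(<x, x>) = sqrt(1/3)
ip = inner_product(lambda x: 1.0, lambda x: x, 0.0, 1.0)
norm_x = math.sqrt(inner_product(lambda x: x, lambda x: x, 0.0, 1.0))
```

The same `inner_product_discrete` routine is what the discrete regression examples later in these notes reduce to, with unit weights.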
Basis Functions: Linear Independence
A sequence of (n + 1) functions $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n$ is linearly independent if
$\sum_{j=0}^{n} c_j \varphi_j = 0 \;\Rightarrow\; c_j = 0 \;\forall j$
This sequence of functions builds (or spans) an (n + 1)-dimensional linear subspace.
From the definition of norm: $\left\| \sum_{j=0}^{n} c_j \varphi_j \right\| = 0$ is true only if $c_j = 0 \;\forall j$.
Linearity of the inner product: $\left\langle \sum_{j=0}^{n} c_j \varphi_j, g \right\rangle = \sum_{j=0}^{n} c_j \langle \varphi_j, g \rangle$
Orthogonal Functions
Two real-valued continuous functions f(x) and g(x) are said to be orthogonal if $\langle f, g \rangle = 0$.
A finite or infinite sequence of functions $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n, \ldots$ makes an orthogonal system if $\langle \varphi_i, \varphi_j \rangle = 0$ for all $i \neq j$ and $\|\varphi_i\| \neq 0$ for all i. In addition, if $\|\varphi_i\| = 1$, the sequence of functions is called an orthonormal system.
Pythagorean theorem for functions: if $\langle f, g \rangle = 0$, then
$\|f + g\|^2 = \langle f + g,\, f + g \rangle = \langle f, f \rangle + \langle f, g \rangle + \langle g, f \rangle + \langle g, g \rangle = \|f\|^2 + 0 + 0 + \|g\|^2 = \|f\|^2 + \|g\|^2$
Generalized for an orthogonal system: $\left\| \sum_{j=0}^{n} c_j \varphi_j \right\|^2 = \sum_{j=0}^{n} c_j^2 \|\varphi_j\|^2$
Least Square Problem
Let f(x) be a continuous real-valued function in (a, b) that is to be approximated by p(x), a linear combination of a system of (n + 1) linearly independent functions $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n$ as shown below:
$p(x) = \sum_{j=0}^{n} c_j \varphi_j(x)$
Determine the coefficients $c_j = c_j^*$ such that a weighted Euclidean norm or seminorm of the error p(x) − f(x) becomes as small as possible:
$\|p - f\|^2 = \int_a^b w(x) \left[ p(x) - f(x) \right]^2 dx$ is minimum when $c_j = c_j^*$ (norm, continuous case), or
$\|p - f\|^2 = \sum_{i=0}^{m} w_i \left[ p(x_i) - f(x_i) \right]^2$ is minimum when $c_j = c_j^*$ (seminorm, discrete case).
Least square solution: $p^*(x) = \sum_{j=0}^{n} c_j^* \varphi_j(x)$
Schematic of Least Square Solution
f(x) lies outside the space spanned by the system of (n + 1) linearly independent functions $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n$; inside that space lie
$p^*(x) = \sum_{j=0}^{n} c_j^* \varphi_j(x)$ and $p(x) = \sum_{j=0}^{n} c_j \varphi_j(x)$
Solution: $\langle p^*(x) - f(x),\, \varphi_k \rangle = 0$ for $k = 0, 1, 2, \ldots, n$, i.e., the error of the least square solution is orthogonal to the subspace.
Least Square Solution: Proof
When $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n$ are linearly independent, the least square problem has a unique solution:
$p^*(x) = \sum_{j=0}^{n} c_j^* \varphi_j(x)$
where $p^*(x) - f(x)$ is orthogonal to all $\varphi_j$, $j = 0, 1, 2, \ldots, n$.
We need to prove:
1. $\|p - f\|^2$ is minimum when $p^*(x) - f(x)$ is orthogonal to all $\varphi_j$, $j = 0, 1, 2, \ldots, n$
2. existence and uniqueness of the least square solution
Least Square Solution: Proof
Let $c_0, c_1, \ldots, c_n$ be another sequence of coefficients with $c_j \neq c_j^*$ for at least one j. Then
$p(x) - f(x) = \left[ p^*(x) - f(x) \right] + \sum_{j=0}^{n} (c_j - c_j^*) \varphi_j(x)$
If $p^*(x) - f(x)$ is orthogonal to all $\varphi_j$, it is also orthogonal to their linear combination $\sum_{j=0}^{n} (c_j - c_j^*) \varphi_j(x)$. According to the Pythagorean theorem, we have:
$\|p - f\|^2 = \|p^* - f\|^2 + \left\| \sum_{j=0}^{n} (c_j - c_j^*) \varphi_j \right\|^2 > \|p^* - f\|^2$
Therefore, if $p^*(x) - f(x)$ is orthogonal to all $\varphi_j$, then $p^*(x)$ is the solution of the least square problem.
Least Square Solution: Normal Equations
If $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n$ are linearly independent, the solution to the least square problem satisfies:
$\langle p^*(x) - f(x),\, \varphi_k(x) \rangle = 0, \quad k = 0, 1, 2, \ldots, n$, where $p^*(x) = \sum_{j=0}^{n} c_j^* \varphi_j(x)$
Therefore,
$\left\langle \sum_{j=0}^{n} c_j^* \varphi_j(x) - f(x),\, \varphi_k(x) \right\rangle = 0, \quad k = 0, 1, 2, \ldots, n$
$\sum_{j=0}^{n} \langle \varphi_j(x), \varphi_k(x) \rangle \, c_j^* = \langle f(x), \varphi_k(x) \rangle, \quad k = 0, 1, 2, \ldots, n$
These are the normal equations!
Least Square Solution: Normal Equations
$\sum_{j=0}^{n} \langle \varphi_j, \varphi_k \rangle \, c_j^* = \langle f, \varphi_k \rangle, \quad k = 0, 1, 2, \ldots, n$
Written out, one equation per k:
k = 0: $\langle \varphi_0, \varphi_0 \rangle c_0^* + \langle \varphi_1, \varphi_0 \rangle c_1^* + \langle \varphi_2, \varphi_0 \rangle c_2^* + \cdots + \langle \varphi_n, \varphi_0 \rangle c_n^* = \langle f, \varphi_0 \rangle$
k = 1: $\langle \varphi_0, \varphi_1 \rangle c_0^* + \langle \varphi_1, \varphi_1 \rangle c_1^* + \langle \varphi_2, \varphi_1 \rangle c_2^* + \cdots + \langle \varphi_n, \varphi_1 \rangle c_n^* = \langle f, \varphi_1 \rangle$
⋮
k = n: $\langle \varphi_0, \varphi_n \rangle c_0^* + \langle \varphi_1, \varphi_n \rangle c_1^* + \langle \varphi_2, \varphi_n \rangle c_2^* + \cdots + \langle \varphi_n, \varphi_n \rangle c_n^* = \langle f, \varphi_n \rangle$
Moreover, if $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n$ is an orthogonal system, the coefficient matrix is diagonal and
$c_j^* = \frac{\langle f, \varphi_j \rangle}{\langle \varphi_j, \varphi_j \rangle}, \quad j = 0, 1, 2, \ldots, n$
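The assembly and solution of the normal equations can be written generically. A sketch in pure Python (the helper names `solve_linear` and `normal_equations` are my own; `inner` is whatever inner-product routine the caller supplies, continuous or discrete):

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting on copies of A and b."""
    n = len(b)
    A = [row[:] for row in A]
    b = b[:]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(A[i][k]))  # pivot row
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
            b[i] -= m * b[k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

def normal_equations(inner, f, basis):
    """Assemble G c = d with G[k][j] = <phi_j, phi_k> and d[k] = <f, phi_k>."""
    G = [[inner(pj, pk) for pj in basis] for pk in basis]
    d = [inner(f, pk) for pk in basis]
    return G, d

# Demo: a discrete least-square fit recovers exactly-linear data
xs = [0.0, 1.0, 2.0, 3.0]
inner = lambda u, v: sum(u(t) * v(t) for t in xs)
G, d = normal_equations(inner, lambda t: 2.0 + 3.0 * t,
                        [lambda t: 1.0, lambda t: t])
c = solve_linear(G, d)   # approximately [2.0, 3.0]
```

All the worked examples below are instances of this pattern with different `inner` routines and basis sets.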
Least Square Solution: Existence and Uniqueness
$\sum_{j=0}^{n} \langle \varphi_j, \varphi_k \rangle \, c_j^* = \langle f, \varphi_k \rangle, \quad k = 0, 1, 2, \ldots, n$
A solution to the normal equations exists and is unique unless the following homogeneous system has a nontrivial solution for $c_0, c_1, \ldots, c_n$, i.e., $c_j \neq 0$ for at least one j:
$\sum_{j=0}^{n} \langle \varphi_j, \varphi_k \rangle \, c_j = 0, \quad k = 0, 1, 2, \ldots, n$
Least Square Solution: Existence and Uniqueness
Suppose the homogeneous system $\sum_{j=0}^{n} \langle \varphi_j, \varphi_k \rangle \, c_j = 0$, $k = 0, 1, \ldots, n$, had a nontrivial solution ($c_j \neq 0$ for at least one j). This would lead to
$\left\| \sum_{j=0}^{n} c_j \varphi_j \right\|^2 = \left\langle \sum_{j=0}^{n} c_j \varphi_j,\, \sum_{k=0}^{n} c_k \varphi_k \right\rangle = \sum_{k=0}^{n} c_k \sum_{j=0}^{n} \langle \varphi_j, \varphi_k \rangle \, c_j = \sum_{k=0}^{n} c_k \cdot 0 = 0$
which contradicts the assumption that $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n$ are linearly independent.
Least Square Solution: Example (Continuous)
Approximate the function f(x) = 1/(1 + x²) for x in [0, 1] using a straight line.
The basis functions are $\varphi_0(x) = 1$ and $\varphi_1(x) = x$, and $p(x) = \sum_{j=0}^{1} c_j \varphi_j(x)$.
Normal equations:
$\langle \varphi_0, \varphi_0 \rangle c_0 + \langle \varphi_1, \varphi_0 \rangle c_1 = \langle f, \varphi_0 \rangle$
$\langle \varphi_0, \varphi_1 \rangle c_0 + \langle \varphi_1, \varphi_1 \rangle c_1 = \langle f, \varphi_1 \rangle$
Using the continuous inner product $\langle f, g \rangle = \int_0^1 f(x) g(x) \, dx$ (unit weight):
$\langle \varphi_0, \varphi_0 \rangle = \int_0^1 1 \cdot 1 \, dx = 1$
$\langle \varphi_0, \varphi_1 \rangle = \langle \varphi_1, \varphi_0 \rangle = \int_0^1 x \, dx = \tfrac{1}{2} = 0.5$
$\langle \varphi_1, \varphi_1 \rangle = \int_0^1 x^2 \, dx = \tfrac{1}{3}$
$\langle f, \varphi_0 \rangle = \int_0^1 \frac{1}{1 + x^2} \, dx = \frac{\pi}{4}$
$\langle f, \varphi_1 \rangle = \int_0^1 \frac{x}{1 + x^2} \, dx = \frac{\ln 2}{2}$
Least Square Solution: Example (Continuous)
$\begin{pmatrix} 1 & 0.5 \\ 0.5 & 1/3 \end{pmatrix} \begin{pmatrix} c_0 \\ c_1 \end{pmatrix} = \begin{pmatrix} \pi/4 \\ \ln 2 / 2 \end{pmatrix}$
Solving,
$c_0 = 1.062, \quad c_1 = -0.5535$
$p(x) = \sum_{j=0}^{1} c_j \varphi_j(x) = 1.062 - 0.5535\,x$
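This 2×2 system is small enough to verify directly. A sketch using Cramer's rule on the analytically evaluated inner products:

```python
import math

# Normal equations for f(x) = 1/(1+x^2) on [0, 1] with basis {1, x}
G = [[1.0, 0.5],
     [0.5, 1.0 / 3.0]]
d = [math.pi / 4.0, math.log(2.0) / 2.0]

# Solve the 2x2 system by Cramer's rule
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
c0 = (d[0] * G[1][1] - G[0][1] * d[1]) / det
c1 = (G[0][0] * d[1] - d[0] * G[1][0]) / det
# c0 ~ 1.062, c1 ~ -0.5535, i.e. p(x) = 1.062 - 0.5535 x
```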
Least Square Solution: Example (Continuous)
Approximate the function f(x) = 1/(1 + x²) for x in [0, 1] using a 2nd order polynomial.
The basis functions are $\varphi_0 = 1$, $\varphi_1 = x$, $\varphi_2 = x^2$, and $p(x) = \sum_{j=0}^{2} c_j \varphi_j(x)$.
Normal equations:
$\langle \varphi_0, \varphi_0 \rangle c_0 + \langle \varphi_1, \varphi_0 \rangle c_1 + \langle \varphi_2, \varphi_0 \rangle c_2 = \langle f, \varphi_0 \rangle$
$\langle \varphi_0, \varphi_1 \rangle c_0 + \langle \varphi_1, \varphi_1 \rangle c_1 + \langle \varphi_2, \varphi_1 \rangle c_2 = \langle f, \varphi_1 \rangle$
$\langle \varphi_0, \varphi_2 \rangle c_0 + \langle \varphi_1, \varphi_2 \rangle c_1 + \langle \varphi_2, \varphi_2 \rangle c_2 = \langle f, \varphi_2 \rangle$
Additional inner products to be evaluated are:
$\langle \varphi_0, \varphi_2 \rangle = \langle \varphi_2, \varphi_0 \rangle = \int_0^1 x^2 \, dx = \tfrac{1}{3}$
$\langle \varphi_1, \varphi_2 \rangle = \langle \varphi_2, \varphi_1 \rangle = \int_0^1 x^3 \, dx = \tfrac{1}{4} = 0.25$
$\langle \varphi_2, \varphi_2 \rangle = \int_0^1 x^4 \, dx = \tfrac{1}{5} = 0.2$
$\langle f, \varphi_2 \rangle = \int_0^1 \frac{x^2}{1 + x^2} \, dx = 1 - \frac{\pi}{4}$
Least Square Solution: Example (Continuous)
$\begin{pmatrix} 1 & 0.5 & 1/3 \\ 0.5 & 1/3 & 0.25 \\ 1/3 & 0.25 & 0.2 \end{pmatrix} \begin{pmatrix} c_0 \\ c_1 \\ c_2 \end{pmatrix} = \begin{pmatrix} \pi/4 \\ \ln 2 / 2 \\ 1 - \pi/4 \end{pmatrix}$
Solving by Gauss elimination:
$c_0 = 1.030, \quad c_1 = -0.3605, \quad c_2 = -0.1930$
$p(x) = \sum_{j=0}^{2} c_j \varphi_j(x) = 1.030 - 0.3605\,x - 0.193\,x^2$
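The 3×3 Gauss elimination can be reproduced in a few lines; a sketch (pivoting is skipped here only because this small symmetric positive-definite system does not need it):

```python
import math

# Normal equations for f(x) = 1/(1+x^2) on [0, 1], basis {1, x, x^2}
A = [[1.0, 1 / 2, 1 / 3],
     [1 / 2, 1 / 3, 1 / 4],
     [1 / 3, 1 / 4, 1 / 5]]
b = [math.pi / 4, math.log(2) / 2, 1 - math.pi / 4]

# Forward elimination
for k in range(3):
    for i in range(k + 1, 3):
        m = A[i][k] / A[k][k]
        for j in range(k, 3):
            A[i][j] -= m * A[k][j]
        b[i] -= m * b[k]

# Back substitution
c = [0.0] * 3
for i in (2, 1, 0):
    c[i] = (b[i] - sum(A[i][j] * c[j] for j in range(i + 1, 3))) / A[i][i]
# c is approximately [1.030, -0.3605, -0.1930]
```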
Least Square Solution: Example (Continuous)
How do you judge how good the fit is, or how do you compare the fits of two polynomials? Estimate the error norm $\|p - f\|$. Two options: analytical (supplemented by numerical evaluation of the integrals) or numerical (supplemented by visual comparison of the plots).
Discrete Data
Given (n + 1) observations or data pairs $(x_0, y_0), (x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$ and (m + 1) basis functions $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_m$, the approximating polynomial is $p(x) = \sum_{j=0}^{m} a_j \varphi_j(x)$. Requiring $p(x_i) = y_i$ at every data point gives:
$a_0 \varphi_0(x_0) + a_1 \varphi_1(x_0) + a_2 \varphi_2(x_0) + \cdots + a_m \varphi_m(x_0) = y_0$
$a_0 \varphi_0(x_1) + a_1 \varphi_1(x_1) + a_2 \varphi_2(x_1) + \cdots + a_m \varphi_m(x_1) = y_1$
⋮
$a_0 \varphi_0(x_n) + a_1 \varphi_1(x_n) + a_2 \varphi_2(x_n) + \cdots + a_m \varphi_m(x_n) = y_n$
This is a system of (n + 1) equations in (m + 1) unknowns:
m < n: over-determined system, least square regression
m = n: unique solution, interpolation
m > n: under-determined system
Least Square Solution: Example (Discrete Data)
Variation of ultimate shear strength (y) with curing temperature (x) for a certain rubber compound was reported (J. Quality Technology, 1971, pp. 149-155) as:
x, in °C: 138, 140, 144.5, 146, 148, 151.5, 153.5, 157
y, in psi: 770, 800, 840, 810, 735, 640, 590, 560
Fit a linear and a quadratic regression model to the data.
For the linear model, the basis functions and the polynomial are:
$\varphi_0(x) = 1$; $\varphi_1(x) = x$; $p(x) = \sum_{j=0}^{1} a_j \varphi_j(x)$
Normal equations:
$\langle \varphi_0, \varphi_0 \rangle a_0 + \langle \varphi_1, \varphi_0 \rangle a_1 = \langle f, \varphi_0 \rangle$
$\langle \varphi_0, \varphi_1 \rangle a_0 + \langle \varphi_1, \varphi_1 \rangle a_1 = \langle f, \varphi_1 \rangle$
Given vectors: x = [138, 140, 144.5, 146, 148, 151.5, 153.5, 157] and y = f(x) = [770, 800, 840, 810, 735, 640, 590, 560]. Number of data points, m = 8. Denote elements of x and y as $x_i$ and $y_i$, respectively.
Least Square Solution: Example (Discrete Data)
With unit weights, the discrete inner products are sums over the 8 data points:
$\langle \varphi_0, \varphi_0 \rangle = \sum_{i=1}^{8} 1 \cdot 1 = 8; \quad \langle \varphi_1, \varphi_1 \rangle = \sum_{i=1}^{8} x_i^2 = 173907.75$
$\langle \varphi_1, \varphi_0 \rangle = \langle \varphi_0, \varphi_1 \rangle = \sum_{i=1}^{8} x_i = 1178.5$
$\langle f, \varphi_0 \rangle = \sum_{i=1}^{8} y_i = 5745; \quad \langle f, \varphi_1 \rangle = \sum_{i=1}^{8} x_i y_i = 842125$
$\begin{pmatrix} 8 & 1178.5 \\ 1178.5 & 173907.75 \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \end{pmatrix} = \begin{pmatrix} 5745 \\ 842125 \end{pmatrix}$
Solving,
$a_0 = 2773.50, \quad a_1 = -13.9525$
$p(x) = \sum_{j=0}^{1} a_j \varphi_j(x) = 2773.5 - 13.9525\,x$
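The sums and the 2×2 solve can be checked in a few lines of Python:

```python
xs = [138, 140, 144.5, 146, 148, 151.5, 153.5, 157]
ys = [770, 800, 840, 810, 735, 640, 590, 560]

n = len(xs)                                  # 8 data points
sx = sum(xs)                                 # <phi0, phi1> = 1178.5
sxx = sum(x * x for x in xs)                 # <phi1, phi1> = 173907.75
sy = sum(ys)                                 # <f, phi0>   = 5745
sxy = sum(x * y for x, y in zip(xs, ys))     # <f, phi1>   = 842125

# Solve the 2x2 normal equations directly
det = n * sxx - sx * sx
a0 = (sy * sxx - sx * sxy) / det
a1 = (n * sxy - sx * sy) / det
# a0 ~ 2773.5, a1 ~ -13.9525  ->  p(x) = 2773.5 - 13.9525 x
```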
Least Square Solution: Example (Discrete Data)
For the quadratic model, the basis functions and the polynomial are:
$\varphi_0 = 1$; $\varphi_1 = x$; $\varphi_2 = x^2$; $p(x) = \sum_{j=0}^{2} a_j \varphi_j(x)$
Normal equations:
$\langle \varphi_0, \varphi_0 \rangle a_0 + \langle \varphi_1, \varphi_0 \rangle a_1 + \langle \varphi_2, \varphi_0 \rangle a_2 = \langle f, \varphi_0 \rangle$
$\langle \varphi_0, \varphi_1 \rangle a_0 + \langle \varphi_1, \varphi_1 \rangle a_1 + \langle \varphi_2, \varphi_1 \rangle a_2 = \langle f, \varphi_1 \rangle$
$\langle \varphi_0, \varphi_2 \rangle a_0 + \langle \varphi_1, \varphi_2 \rangle a_1 + \langle \varphi_2, \varphi_2 \rangle a_2 = \langle f, \varphi_2 \rangle$
Additional inner products to be evaluated are:
$\langle \varphi_0, \varphi_2 \rangle = \langle \varphi_2, \varphi_0 \rangle = \sum_{i=1}^{8} x_i^2 = 173907.75$
$\langle \varphi_1, \varphi_2 \rangle = \langle \varphi_2, \varphi_1 \rangle = \sum_{i=1}^{8} x_i^3 = 25707160$
$\langle \varphi_2, \varphi_2 \rangle = \sum_{i=1}^{8} x_i^4 = 3806534454$
$\langle f, \varphi_2 \rangle = \sum_{i=1}^{8} x_i^2 y_i = 123643297.5$
Least Square Solution: Example (Discrete Data)
$\begin{pmatrix} 8 & 1178.5 & 173907.75 \\ 1178.5 & 173907.75 & 25707160 \\ 173907.75 & 25707160 & 3806534454 \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ a_2 \end{pmatrix} = \begin{pmatrix} 5745 \\ 842125 \\ 123643297.5 \end{pmatrix}$
Solving by Gauss elimination:
$a_0 = -21935.4, \quad a_1 = 322.103, \quad a_2 = -1.14067$
$p(x) = \sum_{j=0}^{2} a_j \varphi_j(x) = -21935.4 + 322.103\,x - 1.14067\,x^2$
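The quadratic fit can be reproduced end to end, from the raw data through the normal-equation sums to the Gauss elimination with partial pivoting (this system is noticeably ill-conditioned in the monomial basis, which is one motivation for the orthogonal bases introduced later):

```python
xs = [138, 140, 144.5, 146, 148, 151.5, 153.5, 157]
ys = [770, 800, 840, 810, 735, 640, 590, 560]

n = len(xs)
sx = sum(xs)
sx2 = sum(x ** 2 for x in xs)
sx3 = sum(x ** 3 for x in xs)
sx4 = sum(x ** 4 for x in xs)
sy = sum(ys)
sxy = sum(x * y for x, y in zip(xs, ys))
sx2y = sum(x * x * y for x, y in zip(xs, ys))

A = [[n, sx, sx2], [sx, sx2, sx3], [sx2, sx3, sx4]]
b = [sy, sxy, sx2y]

# Gauss elimination with partial pivoting
for k in range(3):
    p = max(range(k, 3), key=lambda i: abs(A[i][k]))
    A[k], A[p], b[k], b[p] = A[p], A[k], b[p], b[k]
    for i in range(k + 1, 3):
        m = A[i][k] / A[k][k]
        A[i] = [aij - m * akj for aij, akj in zip(A[i], A[k])]
        b[i] -= m * b[k]
a = [0.0, 0.0, 0.0]
for i in (2, 1, 0):
    a[i] = (b[i] - sum(A[i][j] * a[j] for j in range(i + 1, 3))) / A[i][i]
# a is approximately [-21935.4, 322.103, -1.14067]
```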
Least Square Solution: Example (Discrete Data)
How do you judge how good the fit is, or how do you compare the fits of two polynomials? Sum of squared errors!
Least Square Solution: Example (Discrete Data)
Denote $\hat{y}_i = p(x_i)$ and $\bar{y} = \frac{1}{n + 1} \sum_{i=0}^{n} y_i$, and define
$S_E^2 = \sum_{i=0}^{n} (\hat{y}_i - y_i)^2$ (sum of squared errors), $\quad S^2 = \sum_{i=0}^{n} (y_i - \bar{y})^2$ (total sum of squares)
The coefficient of determination R² (R is the correlation coefficient) is given by:
$R^2 = \frac{S^2 - S_E^2}{S^2}$
For the linear fit, R² = 0.73; for the quadratic fit, R² = 0.88.
Additional Points
You can do multiple regression using the same framework.
Some non-linear equations can be linearized and then fitted with linear least squares.
For functions that are not easy to integrate analytically, the inner-product integrals can be evaluated using numerical methods for integration.
ESO 208A: Computational Methods in Engineering Orthogonal Basis, Periodic Functions
Least Square Solution: Normal Equations
$\sum_{j=0}^{n} \langle \varphi_j, \varphi_k \rangle \, c_j^* = \langle f, \varphi_k \rangle, \quad k = 0, 1, 2, \ldots, n$
If $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n$ is an orthogonal system:
$c_j^* = \frac{\langle f, \varphi_j \rangle}{\langle \varphi_j, \varphi_j \rangle}, \quad j = 0, 1, 2, \ldots, n$
Let us explore orthogonal systems of basis functions!
Orthogonal Polynomials: Tchebycheff
Consider the identity: $\cos(n + 1)\theta + \cos(n - 1)\theta = 2 \cos\theta \cos n\theta, \quad n \ge 1$
n = 1: $\cos 2\theta = 2\cos^2\theta - 1$
n = 2: $\cos 3\theta = 2\cos\theta \cos 2\theta - \cos\theta = 4\cos^3\theta - 3\cos\theta$
Define $x = \cos\theta$, $\theta \in [0, \pi]$, and $T_n(x) = \cos n\theta = \cos(n \cos^{-1} x)$
Recursion formula: $T_{n+1}(x) = 2x\,T_n(x) - T_{n-1}(x)$
Example: $T_0(x) = 1$; $T_1(x) = x$; $T_2(x) = 2x^2 - 1$
Symmetry: $T_n(-x) = (-1)^n T_n(x)$
Leading coefficient: $2^{n-1}$ for $n \ge 1$ and 1 for n = 0
Tn(x) constitutes an orthogonal family of polynomials in [-1, 1]!
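The three-term recursion generates the coefficient table directly; a sketch (coefficients stored in ascending powers, the helper names are mine):

```python
import math

def chebyshev_T(n):
    """Coefficient lists (ascending powers) for T_0..T_n via
    T_{m+1}(x) = 2x T_m(x) - T_{m-1}(x)."""
    polys = [[1.0], [0.0, 1.0]]           # T_0 = 1, T_1 = x
    for m in range(1, n):
        tm, tm1 = polys[m], polys[m - 1]
        nxt = [0.0] * (m + 2)
        for i, c in enumerate(tm):        # 2x * T_m shifts powers up by one
            nxt[i + 1] += 2.0 * c
        for i, c in enumerate(tm1):       # subtract T_{m-1}
            nxt[i] -= c
        polys.append(nxt)
    return polys[: n + 1]

def eval_poly(coeffs, x):
    return sum(c * x ** i for i, c in enumerate(coeffs))

T = chebyshev_T(5)
# The defining property T_n(cos t) = cos(n t) holds for each generated row.
```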
Orthogonal Polynomials: Tchebycheff
$T_n(x)$ has n zeros in [-1, 1], called the Tchebycheff abscissae:
$x_k = \cos\left( \frac{(2k + 1)\pi}{2n} \right), \quad k = 0, 1, 2, \ldots, n - 1$
This follows from $\cos n\theta = 0$ for $\theta = \frac{(2k + 1)\pi}{2n}$.
$T_n(x)$ has n + 1 extrema in [-1, 1]:
$x_k = \cos\left( \frac{k\pi}{n} \right), \quad T_n(x_k) = (-1)^k, \quad k = 0, 1, 2, \ldots, n$
This follows from the fact that $\cos n\theta$ has extrema at $\theta = \frac{k\pi}{n}$.
Orthogonal Polynomials: Tchebycheff
Orthogonality (continuous):
$\langle T_n, T_m \rangle = \int_0^\pi \cos n\theta \cos m\theta \, d\theta = \int_{-1}^{1} \frac{T_n(x) T_m(x)}{\sqrt{1 - x^2}} \, dx = \begin{cases} 0 & \text{if } m \neq n \\ \pi/2 & \text{if } m = n \neq 0 \\ \pi & \text{if } m = n = 0 \end{cases}$
(i.e., the weight function is $w(x) = 1/\sqrt{1 - x^2}$.)
For arbitrary f(x) on $a \le x \le b$, map to $-1 \le t \le 1$ via:
$x = \frac{a + b}{2} + \frac{b - a}{2}\,t$
Orthogonal Polynomials: Tchebycheff
Orthogonality (discrete): for $0 \le m \le N$ and $0 \le n \le N$,
$\langle T_n, T_m \rangle = \sum_{k=0}^{N} T_n(x_k) T_m(x_k) = \begin{cases} 0 & \text{if } m \neq n \\ \frac{1}{2}(N + 1) & \text{if } m = n \neq 0 \\ N + 1 & \text{if } m = n = 0 \end{cases}$
where $\{x_k\}$ are the zeros of $T_{N+1}(x)$.
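The discrete orthogonality relation can be verified numerically at the zeros of $T_{N+1}$; a sketch (N = 6 is an arbitrary choice for the check):

```python
import math

def T(n, x):
    """T_n(x) = cos(n arccos x) for x in [-1, 1]."""
    return math.cos(n * math.acos(max(-1.0, min(1.0, x))))

N = 6
# Zeros of T_{N+1}: x_k = cos((2k+1) pi / (2(N+1))), k = 0..N
zeros = [math.cos((2 * k + 1) * math.pi / (2 * (N + 1))) for k in range(N + 1)]

def dot(m, n):
    """Discrete inner product of T_m and T_n over the zeros of T_{N+1}."""
    return sum(T(m, x) * T(n, x) for x in zeros)

# dot(m, n) -> 0 for m != n; (N+1)/2 for m == n != 0; N+1 for m == n == 0
```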
Orthogonal Polynomials: Legendre
Solutions of Legendre's equation (for n a non-negative integer):
$\frac{d}{dx}\left[ (1 - x^2) \frac{dy}{dx} \right] + n(n + 1)\,y = 0, \quad -1 \le x \le 1$
The solutions are an orthogonal set of polynomials given by Rodrigues' formula:
$P_0(x) = 1; \quad P_n(x) = \frac{1}{2^n n!} \frac{d^n}{dx^n} (x^2 - 1)^n$
Bonnet's recursive relation for $n \ge 2$:
$P_n(x) = \frac{2n - 1}{n}\,x\,P_{n-1}(x) - \frac{n - 1}{n}\,P_{n-2}(x)$
Examples: $P_0(x) = 1$; $P_1(x) = x$; $P_2(x) = \frac{1}{2}(3x^2 - 1)$; $P_3(x) = \frac{1}{2}(5x^3 - 3x)$
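Bonnet's relation gives a compact evaluator; a sketch (the function name is mine):

```python
def legendre_P(n, x):
    """Evaluate P_n(x) with Bonnet's recursion
    P_m(x) = ((2m - 1) x P_{m-1}(x) - (m - 1) P_{m-2}(x)) / m."""
    p_prev, p = 1.0, x            # P_0 and P_1
    if n == 0:
        return p_prev
    for m in range(2, n + 1):
        p_prev, p = p, ((2 * m - 1) * x * p - (m - 1) * p_prev) / m
    return p
```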
Orthogonal Polynomials: Legendre
Symmetry: $P_n(-x) = (-1)^n P_n(x)$
Orthogonality (continuous):
$\langle P_n, P_m \rangle = \int_{-1}^{1} P_n(x) P_m(x) \, dx = \begin{cases} 0 & \text{if } m \neq n \\ \frac{2}{2n + 1} & \text{if } m = n \end{cases}$
Also, $P_n(1) = 1$ and $|P_n(x)| \le 1$ on [-1, 1].
For discrete data on equidistant data points, the analogous orthonormal family is the Gram polynomials, discussed next.
Orthonormal Polynomials: Gram
For equidistant discrete data: for $-1 \le x \le 1$, a net of (n + 1) equidistant points is given by:
$x_k = -1 + \frac{2k}{n}, \quad k = 0, 1, 2, \ldots, n$
On this net, the orthonormal set of polynomials $G_m(x)$ is generated by the three-term recursion:
$G_{-1}(x) = 0; \quad G_0(x) = \frac{1}{\sqrt{n + 1}}; \quad G_{m+1}(x) = \alpha_m\,x\,G_m(x) - \frac{\alpha_m}{\alpha_{m-1}}\,G_{m-1}(x)$
$\alpha_m = \frac{n}{m + 1} \sqrt{\frac{4(m + 1)^2 - 1}{(n + 1)^2 - (m + 1)^2}}, \quad m = 0, 1, 2, \ldots, n - 1$
Example (for n = 5):
$G_0(x) = \frac{1}{\sqrt{6}}; \quad G_1(x) = \frac{5x}{\sqrt{70}}; \quad G_2(x) = \frac{25}{16} \sqrt{\frac{3}{7}} \left( x^2 - \frac{7}{15} \right)$
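The recursion and the coefficient $\alpha_m$ above are reconstructed from a garbled slide, so it is worth checking orthonormality on the net numerically; a sketch (the function names are mine, and the n = 5 example from the slide is used as the check):

```python
import math

def gram_G(n, m, x):
    """Evaluate the Gram polynomial G_m at x for the net x_k = -1 + 2k/n,
    via G_{m+1} = alpha_m x G_m - (alpha_m / alpha_{m-1}) G_{m-1}."""
    g_prev, g = 0.0, 1.0 / math.sqrt(n + 1)   # G_{-1} = 0, G_0 = 1/sqrt(n+1)
    a_prev = 1.0   # placeholder for alpha_{-1}; irrelevant since G_{-1} = 0
    for i in range(m):
        a = (n / (i + 1)) * math.sqrt((4 * (i + 1) ** 2 - 1.0) /
                                      ((n + 1) ** 2 - (i + 1) ** 2))
        g_prev, g = g, a * x * g - (a / a_prev) * g_prev
        a_prev = a
    return g

n = 5
net = [-1 + 2 * k / n for k in range(n + 1)]

def dot(p, q):
    """Discrete inner product of G_p and G_q over the net."""
    return sum(gram_G(n, p, x) * gram_G(n, q, x) for x in net)
```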
Orthonormal Polynomials: Gram
Orthogonality:
$\langle G_m, G_l \rangle = \sum_{k=0}^{n} G_m(x_k) G_l(x_k) = \begin{cases} 0 & \text{if } m \neq l \\ 1 & \text{if } m = l \end{cases}$
When $m \ll n^{1/2}$, $G_m(x)$ are very similar to the Legendre polynomials.
When $m \gg n^{1/2}$, $G_m(x)$ have very large oscillations between the net points, i.e., a large maximum norm in [-1, 1].
When fitting a polynomial to equidistant data, one should therefore never choose m larger than about $2 n^{1/2}$.
Least Square Solution: Example (Continuous)
Approximate the function f(x) = 1/(1 + x²) for x in [0, 1] using a 2nd order polynomial built from Legendre polynomials.
For Legendre polynomials, use x = (z + 1)/2 so that for x in [0, 1], z is in [-1, 1]. The function becomes:
$f(z) = \frac{4}{5 + 2z + z^2}$
The basis functions are $\varphi_0 = 1$, $\varphi_1 = z$, $\varphi_2 = \frac{1}{2}(3z^2 - 1)$, and $p(z) = \sum_{j=0}^{2} c_j \varphi_j(z)$.
$\langle f, \varphi_0 \rangle = \int_{-1}^{1} \frac{4}{5 + 2z + z^2} \, dz = \frac{\pi}{2} = 1.5708$
$\langle f, \varphi_1 \rangle = \int_{-1}^{1} \frac{4z}{5 + 2z + z^2} \, dz = 2\ln 2 - \frac{\pi}{2} = -0.1845$
$\langle f, \varphi_2 \rangle = \int_{-1}^{1} \frac{4 \cdot \frac{1}{2}(3z^2 - 1)}{5 + 2z + z^2} \, dz = 12 - 6\ln 2 - \frac{5\pi}{2} = -0.1286 \times 10^{-1}$
Because the basis is orthogonal, $c_j = \langle f, \varphi_j \rangle / \langle \varphi_j, \varphi_j \rangle$:
$c_0 = \frac{1.5708}{2} = 0.7854; \quad c_1 = \frac{-0.1845}{2/3} = -0.2768; \quad c_2 = \frac{-0.1286 \times 10^{-1}}{2/5} = -0.3216 \times 10^{-1}$
Least Square Solution: Example (Continuous)
$c_0 = 0.7854, \quad c_1 = -0.2768, \quad c_2 = -0.3216 \times 10^{-1}$
$p(z) = \sum_{j=0}^{2} c_j \varphi_j(z) = 0.7854 - 0.2768\,z - 0.3216 \times 10^{-1} \cdot \tfrac{1}{2}(3z^2 - 1) = 0.8015 - 0.2768\,z - 0.4824 \times 10^{-1}\,z^2$
If you now substitute z = 2x − 1:
$p(x) = 0.8015 - 0.2768(2x - 1) - 0.4824 \times 10^{-1}(2x - 1)^2 = 1.030 - 0.3605\,x - 0.193\,x^2$
The least square polynomial is unique! It does not depend on the basis!
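The basis-independence claim can be verified numerically: building the fit from the Legendre-basis coefficients and substituting z = 2x − 1 reproduces the monomial-basis polynomial obtained earlier. A sketch using the closed-form inner products from the slide:

```python
import math

# Legendre-basis coefficients c_j = <f, P_j> / <P_j, P_j> on [-1, 1]
c = [(math.pi / 2) / 2,                                   # ~  0.7854
     (2 * math.log(2) - math.pi / 2) / (2 / 3),           # ~ -0.2768
     (12 - 6 * math.log(2) - 5 * math.pi / 2) / (2 / 5)]  # ~ -0.03216

def p_z(z):
    """Fit in the Legendre basis {1, z, (3z^2 - 1)/2}."""
    return c[0] + c[1] * z + c[2] * (3 * z * z - 1) / 2

def p_x(x):
    """Substitute z = 2x - 1 to map back to x in [0, 1]."""
    return p_z(2 * x - 1)
# p_x should agree with the monomial-basis fit 1.030 - 0.3605 x - 0.193 x^2
```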