Understanding Orthogonal Projection and Complement in Linear Algebra

Explore the concepts of orthogonal projection and the orthogonal complement in linear algebra, including their definitions, properties, and applications. Learn what the orthogonal complement of a vector set is, how it behaves for subspaces, and see examples illustrating the key principles, along with how orthogonal projection is used to solve inconsistent systems and least-squares problems.

  • Linear Algebra
  • Orthogonal Projection
  • Complement
  • Vector Sets
  • Subspaces


Presentation Transcript


  1. Orthogonal Projection (Hung-yi Lee)

  2. Reference: textbook, Chapters 7.3 and 7.4

  3. Orthogonal Projection (outline): what is the orthogonal complement; what is orthogonal projection; how to do orthogonal projection; applications of orthogonal projection.

  4. Orthogonal Complement
     The orthogonal complement of a nonempty vector set $S$ is denoted $S^\perp$ ("S perp"). $S^\perp$ is the set of all vectors orthogonal to every vector in $S$:
     $S^\perp = \{v : v \cdot u = 0 \text{ for all } u \in S\}$.
     Examples: if $S = \mathbb{R}^n$ then $S^\perp = \{0\}$, and if $S = \{0\}$ then $S^\perp = \mathbb{R}^n$.

  5. Orthogonal Complement
     Example: let $V = \{[x_1\ x_2\ 0]^T : x_1, x_2 \in \mathbb{R}\}$ and $W = \{[0\ 0\ x_3]^T : x_3 \in \mathbb{R}\}$. Then $V^\perp = W$.
     $W \subseteq V^\perp$: for all $v \in V$ and $w \in W$, $v \cdot w = 0$.
     $V^\perp \subseteq W$: since $e_1, e_2 \in V$, every $z = [z_1\ z_2\ z_3]^T \in V^\perp$ must have $z_1 = z_2 = 0$.

  6. Properties of Orthogonal Complement
     Is $S^\perp$ always a subspace? Yes (even though $S$ itself may not be a subspace).
     For any nonempty vector set $S$, $(\mathrm{Span}\ S)^\perp = S^\perp$.
     Proof sketch: suppose $u \in S^\perp$ and $v \in \mathrm{Span}\ S$ with $S = \{u_1, u_2, \dots, u_k\}$, so $v = c_1 u_1 + c_2 u_2 + \cdots + c_k u_k$. Then
     $v \cdot u = c_1 (u_1 \cdot u) + c_2 (u_2 \cdot u) + \cdots + c_k (u_k \cdot u) = 0$,
     so $u \in (\mathrm{Span}\ S)^\perp$; the reverse inclusion holds because $S \subseteq \mathrm{Span}\ S$.

  7. Properties of Orthogonal Complement
     For any nonempty vector set $S$, $(\mathrm{Span}\ S)^\perp = S^\perp$. In particular, if $W$ is a subspace and $B$ is a basis of $W$, then $B^\perp = W^\perp$.
     What is $W \cap W^\perp$? Only the zero vector: any $v$ in both satisfies $v \cdot v = 0$, so $v = 0$.

  8. Properties of Orthogonal Complement
     Example: let $W = \mathrm{Span}\{u_1, u_2\}$ with $u_1 = [1\ 1\ 1\ 4]^T$ and $u_2 = [1\ 1\ 1\ 2]^T$.
     $v \in W^\perp$ if and only if $u_1 \cdot v = u_2 \cdot v = 0$, i.e., $v = [x_1\ x_2\ x_3\ x_4]^T$ satisfies $Av = 0$, where the rows of $A$ are $u_1^T$ and $u_2^T$.
     So $W^\perp = \{\text{solutions of } Ax = 0\} = \mathrm{Null}\ A$, and any basis of $\mathrm{Null}\ A$ is a basis for $W^\perp$.
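
As a numerical cross-check (my addition, not from the slides), SciPy's `null_space` returns an orthonormal basis of the null space directly. A minimal sketch, assuming the $u_1$, $u_2$ above:

```python
import numpy as np
from scipy.linalg import null_space

# W = Span{u1, u2}; stacking u1, u2 as the rows of A gives W-perp = Null A.
u1 = np.array([1.0, 1.0, 1.0, 4.0])
u2 = np.array([1.0, 1.0, 1.0, 2.0])
A = np.vstack([u1, u2])

Z = null_space(A)                 # columns: an orthonormal basis of W-perp
print(Z.shape)                    # (4, 2), so dim W + dim W-perp = 4
print(np.allclose(A @ Z, 0.0))    # True: each basis vector is orthogonal to u1, u2
```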

  9. Properties of Orthogonal Complement
     For any matrix $A$: $(\mathrm{Row}\ A)^\perp = \mathrm{Null}\ A$.
     ($v \in (\mathrm{Row}\ A)^\perp$ means $w \cdot v = 0$ for all $w \in \mathrm{Span}\{\text{rows of } A\}$, which holds exactly when $Av = 0$.)
     Likewise $(\mathrm{Col}\ A)^\perp = (\mathrm{Row}\ A^T)^\perp = \mathrm{Null}\ A^T$.
     For any subspace $W$ of $\mathbb{R}^n$: $\dim W + \dim W^\perp = n$.

  10. For any subspace $W$ of $\mathbb{R}^n$: $\dim W + \dim W^\perp = n$.
     Take a basis $\{w_1, \dots, w_k\}$ of $W$ and a basis $\{z_1, \dots, z_m\}$ of $W^\perp$; together they form a basis for $\mathbb{R}^n$.
     Hence every vector $u$ can be written as
     $u = c_1 w_1 + \cdots + c_k w_k + d_1 z_1 + \cdots + d_m z_m = w + z$,
     with $w \in W$ and $z \in W^\perp$, and this decomposition $u = w + z$ is unique.
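
A minimal NumPy sketch of this decomposition (my illustration, not from the slides): given a basis matrix $C$ for $W$, an orthonormal basis $Q$ from reduced QR gives $w = QQ^Tu$ and $z = u - w$. The plane and vector below are arbitrary examples.

```python
import numpy as np

def decompose(u, C):
    """Split u into w + z with w in W = Col C and z in W-perp."""
    Q, _ = np.linalg.qr(C)        # orthonormal basis for Col C (reduced QR)
    w = Q @ (Q.T @ u)             # component of u inside W
    z = u - w                     # component of u inside W-perp
    return w, z

C = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])        # basis of an example plane W in R^3
u = np.array([3.0, -1.0, 2.0])

w, z = decompose(u, C)
print(np.allclose(w + z, u))      # True: the pieces recover u
print(np.allclose(C.T @ z, 0.0))  # True: z is orthogonal to every vector in W
```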

  11. Orthogonal Projection (outline recap): what is the orthogonal complement; what is orthogonal projection; how to do orthogonal projection; applications of orthogonal projection.

  12. Orthogonal Projection
     Every $u$ decomposes uniquely as $u = w + z$ with $w \in W$ and $z \in W^\perp$.
     Orthogonal projection operator: the function $U_W(u) = w$ is the orthogonal projection of $u$ on $W$.
     Is it linear? Yes: if $u = w + z$ and $u' = w' + z'$, then $u + u' = (w + w') + (z + z')$ and $ku = kw + kz$, so $U_W(u + u') = U_W(u) + U_W(u')$ and $U_W(ku) = kU_W(u)$.

  13. Orthogonal Projection
     $w = U_W(u)$ is the vector in the subspace $W$ closest to $u$, and $z = u - w$ is always orthogonal to every vector in $W$.

  14. Closest Vector Property
     Among all vectors in a subspace $W$, the vector closest to $u$ is the orthogonal projection $w = U_W(u)$.
     Proof: let $w' \in W$ with $w' \neq w$. Since $u - w \in W^\perp$ and $w - w' \in W$, $(u - w) \cdot (w - w') = 0$, so
     $\|u - w'\|^2 = \|(u - w) + (w - w')\|^2 = \|u - w\|^2 + \|w - w'\|^2 > \|u - w\|^2$.
     The distance from a vector $u$ to a subspace $W$ is therefore the distance between $u$ and the orthogonal projection of $u$ on $W$.

  15. Orthogonal Projection Matrix
     The orthogonal projection operator $U_W$ is linear, so it has a standard matrix, the orthogonal projection matrix $P_W$: $U_W(u) = P_W u$, with $z = u - w \in W^\perp$.

  16. Orthogonal Projection (outline recap): what is the orthogonal complement; what is orthogonal projection; how to do orthogonal projection; applications of orthogonal projection.

  17. Orthogonal Projection on a Line
     Orthogonal projection of a vector on a line $L$: let $v$ be any vector, $u$ any nonzero vector on $L$, $w$ the orthogonal projection of $v$ onto $L$ (so $w = cu$), and $z = v - w$.
     Since $z \cdot u = 0$: $(v - cu) \cdot u = 0$, so $v \cdot u = c\|u\|^2$ and $c = \dfrac{u \cdot v}{\|u\|^2}$.
     Therefore $w = cu = \dfrac{u \cdot v}{\|u\|^2}\,u$, and the distance from the tip of $v$ to $L$ is $\|z\| = \|v - w\|$.

  18. Orthogonal Projection
     Example: $L$ is the line $y = \frac{1}{2}x$, so take $u = [2\ 1]^T$, and let $v = [4\ 1]^T$.
     $c = \dfrac{u \cdot v}{\|u\|^2} = \dfrac{9}{5}$, so $w = cu = \dfrac{9}{5}[2\ 1]^T$, and $z = v - w$ gives the distance from the tip of $v$ to $L$.
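
This example can be checked in a few lines of NumPy; the sketch below assumes the values $u = [2\ 1]^T$ and $v = [4\ 1]^T$ as reconstructed above.

```python
import numpy as np

u = np.array([2.0, 1.0])          # nonzero vector on L: y = x/2
v = np.array([4.0, 1.0])          # vector being projected

c = (u @ v) / (u @ u)             # c = (u.v) / ||u||^2 = 9/5
w = c * u                         # orthogonal projection of v onto L
z = v - w                         # component orthogonal to L

print(c, w)                       # 1.8, [3.6 1.8], i.e., (9/5)[2 1]
print(np.isclose(z @ u, 0.0))     # True: z is orthogonal to u
print(np.linalg.norm(z))          # distance from the tip of v to L
```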

  19. Orthogonal Projection Matrix
     Let $C$ be an $n \times k$ matrix whose columns form a basis for a subspace $W$. Then
     $P_W = C(C^TC)^{-1}C^T$, an $n \times n$ matrix built from $n \times k$, $k \times k$, and $k \times n$ factors.
     Proof: let $u \in \mathbb{R}^n$ and $w = U_W(u)$. Since $W = \mathrm{Col}\ C$, $w = Cb$ for some $b \in \mathbb{R}^k$, and $u - w \in W^\perp$, so
     $0 = C^T(u - w) = C^Tu - C^Tw = C^Tu - C^TCb$, i.e., $C^Tu = C^TCb$.
     Since $C^TC$ is invertible (next slide), $b = (C^TC)^{-1}C^Tu$ and $w = C(C^TC)^{-1}C^Tu$.

  20. Orthogonal Projection Matrix
     Claim: if $C$ has linearly independent columns, then $C^TC$ is invertible.
     Proof: suppose $C^TCb = 0$ for some $b$. Then $b^TC^TCb = (Cb)^T(Cb) = (Cb) \cdot (Cb) = \|Cb\|^2 = 0$, so $Cb = 0$, and $b = 0$ since $C$ has linearly independent columns. Thus $\mathrm{Null}(C^TC) = \{0\}$ and $C^TC$ is invertible.

  21. Orthogonal Projection Matrix
     Example: let $W$ be the 2-dimensional subspace of $\mathbb{R}^3$ with equation $x_1 - x_2 + 2x_3 = 0$.
     $W$ has a basis $\{[1\ 1\ 0]^T, [-2\ 0\ 1]^T\}$, so take $C = \begin{bmatrix} 1 & -2 \\ 1 & 0 \\ 0 & 1 \end{bmatrix}$.
     Then $P_W = C(C^TC)^{-1}C^T = \dfrac{1}{6}\begin{bmatrix} 5 & 1 & -2 \\ 1 & 5 & 2 \\ -2 & 2 & 2 \end{bmatrix}$.
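
A short NumPy check of this projection matrix. The basis above is one valid choice for the plane; any other basis of $W$ yields the same $P_W$.

```python
import numpy as np

C = np.array([[1.0, -2.0],
              [1.0,  0.0],
              [0.0,  1.0]])       # columns: a basis for x1 - x2 + 2*x3 = 0

Pw = C @ np.linalg.inv(C.T @ C) @ C.T   # Pw = C (C^T C)^(-1) C^T

print(np.round(6 * Pw, 6))        # [[5 1 -2], [1 5 2], [-2 2 2]]
print(np.allclose(Pw @ Pw, Pw))   # True: projection matrices are idempotent

n = np.array([1.0, -1.0, 2.0])    # normal vector of the plane
print(np.allclose(Pw @ n, 0.0))   # True: the normal projects to zero
```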

  22. $P_W = C(C^TC)^{-1}C^T$

  23. Orthogonal Projection
     Let $B = \{v_1, v_2, \dots, v_k\}$ be an orthogonal basis for a subspace $W$, and let $u$ be a vector in $W$. Then
     $u = c_1 v_1 + c_2 v_2 + \cdots + c_k v_k$ with $c_i = \dfrac{u \cdot v_i}{\|v_i\|^2}$.
     Now let $u$ be any vector, and let $w$ be the orthogonal projection of $u$ on $W$. Then
     $w = \dfrac{u \cdot v_1}{\|v_1\|^2}v_1 + \dfrac{u \cdot v_2}{\|v_2\|^2}v_2 + \cdots + \dfrac{u \cdot v_k}{\|v_k\|^2}v_k$.

  24. Orthogonal Projection
     Let $B = \{v_1, v_2, \dots, v_k\}$ be an orthogonal basis for $W$, and let $w$ be the orthogonal projection of any vector $u$ on $W$:
     $w = \dfrac{u \cdot v_1}{\|v_1\|^2}v_1 + \cdots + \dfrac{u \cdot v_k}{\|v_k\|^2}v_k$.
     In matrix form, with $C = [v_1\ v_2\ \cdots\ v_k]$, the matrix $C^TC = D$ is diagonal with entries $\|v_i\|^2$, so
     $P_W = C(C^TC)^{-1}C^T = CD^{-1}C^T$ and the projection is $w = CD^{-1}C^Tu$.
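
A sketch confirming this shortcut for orthogonal columns; the basis vectors here are chosen only for illustration.

```python
import numpy as np

# Orthogonal (not orthonormal) basis of a plane in R^3, chosen for illustration.
v1 = np.array([1.0,  1.0, 0.0])
v2 = np.array([1.0, -1.0, 2.0])
assert np.isclose(v1 @ v2, 0.0)   # the basis really is orthogonal

C = np.column_stack([v1, v2])
D = C.T @ C                       # diagonal: diag(||v1||^2, ||v2||^2)
Pw = C @ np.linalg.inv(D) @ C.T   # P_W = C D^(-1) C^T

u = np.array([3.0, 0.0, 1.0])
# Coordinate form from the slide: w = sum_i (u.v_i / ||v_i||^2) v_i
w = (u @ v1) / (v1 @ v1) * v1 + (u @ v2) / (v2 @ v2) * v2
print(np.allclose(Pw @ u, w))     # True: both formulas give the same projection
```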

  25. Orthogonal Projection (outline recap): what is the orthogonal complement; what is orthogonal projection; how to do orthogonal projection; applications of orthogonal projection.

  26. Solution of an Inconsistent System of Linear Equations
     Suppose $Ax = b$ is an inconsistent system of linear equations, i.e., $b$ is not in the column space of $A$.
     Find the vector $z$ minimizing $\|Az - b\|$: for every $x$, $\|Ax - b\| \geq \|Az - b\|$.
     The minimizer satisfies $Az = P_Wb$, where $W = \mathrm{Col}\ A$.

  27. Least Squares Approximation
     Given data pairs $(x_1, y_1), (x_2, y_2), \dots, (x_i, y_i), \dots$ (e.g., using earlier PM2.5 readings to predict PM2.5), find the least-squares line $y = a_0 + a_1x$ that best fits the data. This is regression.

  28. Least Squares Approximation
     Fit $y = a_0 + a_1x$. The error vector has components $y_i - (a_0 + a_1x_i)$, and
     $E = \sum_i \left(y_i - (a_0 + a_1x_i)\right)^2$.
     Find $a_0$ and $a_1$ minimizing $E$.

  29. Least Squares Approximation
     Write $y = [y_1\ \cdots\ y_n]^T$, $v_1 = [1\ \cdots\ 1]^T$, and $v_2 = [x_1\ \cdots\ x_n]^T$. Then the error is
     $E = \|y - (a_0v_1 + a_1v_2)\|^2 = \|y - Ca\|^2$,
     where $C = [v_1\ v_2]$ and $a = [a_0\ a_1]^T$.

  30. Least Squares Approximation
     Find $a$ minimizing $E = \|y - Ca\|^2$. Since $B = \{v_1, v_2\}$ is independent, the minimizing $Ca$ is the orthogonal projection of $y$ on $W = \mathrm{Span}\ B$: find $a$ such that $Ca = P_Wy$.
     $Ca = C(C^TC)^{-1}C^Ty \implies a = (C^TC)^{-1}C^Ty$.
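
A sketch of this recipe on made-up data (the data values below are illustrative, not from the slides); `np.linalg.lstsq` solves the same minimization and should agree with the normal-equation formula.

```python
import numpy as np

# Illustrative data (not from the slides).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.9, 4.1, 5.9, 8.2, 9.8])

C = np.column_stack([np.ones_like(x), x])   # columns v1 = all-ones, v2 = x

# Normal equations, exactly as on the slide: a = (C^T C)^(-1) C^T y
a = np.linalg.inv(C.T @ C) @ C.T @ y
print(a)                                    # [a0, a1] of the line y = a0 + a1*x

# Same minimizer via the numerically preferred solver.
a_lstsq, *_ = np.linalg.lstsq(C, y, rcond=None)
print(np.allclose(a, a_lstsq))              # True
```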

  31. Example 1
     The fitted line is $y = 0.056 + 0.745x$.
     Prediction: if the rough weight is 2.65, the estimated finished weight is $0.056 + 0.745(2.65) = 2.030$.

  32. Least Squares Approximation
     Best quadratic fit: use $y = a_0 + a_1x + a_2x^2$ to fit the data points $(x_1, y_1), (x_2, y_2), \dots, (x_n, y_n)$.
     $E = \sum_{i=1}^n \left(y_i - (a_0 + a_1x_i + a_2x_i^2)\right)^2$.
     Find $a_0$, $a_1$, and $a_2$ minimizing $E$.

  33. Least Squares Approximation
     In matrix form, $E = \|y - Ca\|^2$ with
     $C = \begin{bmatrix} 1 & x_1 & x_1^2 \\ 1 & x_2 & x_2^2 \\ \vdots & \vdots & \vdots \\ 1 & x_n & x_n^2 \end{bmatrix}$, $a = [a_0\ a_1\ a_2]^T$,
     so again $a = (C^TC)^{-1}C^Ty$.
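
The quadratic fit only changes the design matrix; a sketch with hypothetical data points (not from the slides):

```python
import numpy as np

# Hypothetical data points (not from the slides).
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = np.array([100.9, 111.2, 114.0, 108.7, 95.5])

# Design matrix with rows [1, x_i, x_i^2]; minimizing E = ||y - C a||^2.
C = np.column_stack([np.ones_like(x), x, x**2])
a, *_ = np.linalg.lstsq(C, y, rcond=None)
print(a)    # [a0, a1, a2] of the best-fit parabola y = a0 + a1*x + a2*x^2
```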

  34. $y = a_0 + a_1x + a_2x^2$
     Fitted: $y = 101.00 + 29.77x - 16.11x^2$.
     A best-fitting polynomial of any desired maximum degree may be found with the same method.

  35. Multivariable Least Squares Approximation
     The same projection method handles more than one independent variable, e.g., fitting $\hat{y}_i = a_0 + a_1x_i + a_2z_i$ to data triples $(x_i, z_i, y_i)$.
     http://www.palass.org/publications/newsletter/palaeomath-101/palaeomath-part-4-regression-iv
