Understanding Mathematical Regularization Techniques


Explore the concepts of singular value decomposition, highly parameterized inversion, regularization goals, manual regularization, and more in the realm of mathematical regularization techniques for solving inverse problems effectively.

  • Regularization
  • Mathematical Techniques
  • Inverse Problems
  • Singular Value Decomposition
  • Parameterization




Presentation Transcript


  1. Singular Value Decomposition

  2. PEST: the book www.pesthomepage.org

  3. Highly parameterized inversion. Under calibration conditions: h = Zk + ε. Under predictive conditions: s = wᵀk.

  4. Highly parameterized inversion: h = Zk + ε (the same equation shown as a matrix-vector diagram: the wide matrix Z times the long parameter vector k yields the short observation vector h, plus noise ε).

  5. A reminder: what is the goal of regularization? To obtain a solution to the inverse problem that: 1. is unique; 2. yields a parameter field of minimized error variance; 3. (ideally) renders all predictions made by the model of minimized error variance.

  6. Manual regularization

  7. Define a parsimonious parameter field p through the equation k = Np, where N is a selection matrix of ones and zeros that maps the few parameters p onto the many native parameters k. Substituting into h = Zk + ε gives h = ZNp + ε = Xp + ε, with X = ZN.
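As an illustration of this substitution, here is a minimal NumPy sketch; the sizes (six native parameters reduced to three zones) and the random matrix standing in for Z are assumptions, not from the presentation:

```python
import numpy as np

# Hypothetical selection matrix N: maps each of 6 native parameters
# to one of 3 zones (ones and zeros only).
N = np.array([
    [1, 0, 0],
    [1, 0, 0],
    [0, 1, 0],
    [0, 1, 0],
    [0, 0, 1],
    [0, 0, 1],
], dtype=float)

rng = np.random.default_rng(0)
Z = rng.normal(size=(4, 6))     # assumed sensitivity matrix: 4 observations
p = np.array([2.0, -1.0, 0.5])  # parsimonious parameters

# k = Np expands the parsimonious parameters to the full field,
# and h = Zk + eps becomes h = (ZN)p + eps = Xp + eps.
k = N @ p
X = Z @ N
assert np.allclose(Z @ k, X @ p)
```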

  8. h = Xp + ε (matrix-vector diagram of the reduced inverse problem).

  9. History-matching workflow for a well-posed inverse problem (from another video):
     h = Xp + ε
     Q = σ²C⁻¹(ε)
     Φ = Σ(wᵢrᵢ)² = rᵀQr, where r = h − Xp
     p̂ = (XᵀQX)⁻¹XᵀQh
     σ̂² = Φ_min/(n − m)
     C(p̂ − p) = σ̂²(XᵀQX)⁻¹
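The workflow above can be sketched numerically in NumPy; the matrix sizes, weights, and noise model below are illustrative assumptions, not values from the presentation:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 20, 3                        # n observations, m parameters (n > m: well-posed)
X = rng.normal(size=(n, m))
p_true = np.array([1.0, -2.0, 0.5])
w = rng.uniform(0.5, 2.0, size=n)   # assumed observation weights
Q = np.diag(w**2)                   # weight matrix, proportional to C^-1(eps)
h = X @ p_true + rng.normal(scale=1.0 / w)  # noise consistent with the weights

# Weighted least-squares estimate: p_hat = (X^T Q X)^-1 X^T Q h
XtQX = X.T @ Q @ X
p_hat = np.linalg.solve(XtQX, X.T @ Q @ h)

# Objective function Phi = r^T Q r, with residual r = h - X p_hat
r = h - X @ p_hat
phi = r @ Q @ r

# Reference variance and posterior parameter covariance
sigma2 = phi / (n - m)
C_p = sigma2 * np.linalg.inv(XtQX)
```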

  10. Mathematical regularization

  11. Two basic types of mathematical/numerical regularization: 1. Tikhonov 2. Singular value decomposition

  12. Tikhonov regularization: h = Zk + ε, supplemented by information on parameters and/or on relationships between parameters.
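A minimal numerical sketch of Tikhonov regularization, assuming the supplementary information is a preferred parameter value k₀ with weight β (the sizes, the random Z, and the preferred-value form of the constraint are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 5, 12                     # fewer observations than parameters: ill-posed
Z = rng.normal(size=(n, m))
k_true = np.ones(m)
h = Z @ k_true                   # noise-free data for illustration

# Preferred-value Tikhonov: minimize ||h - Zk||^2 + beta^2 ||k - k0||^2,
# giving k_hat = k0 + (Z^T Z + beta^2 I)^-1 Z^T (h - Z k0).
beta = 0.1
k0 = np.zeros(m)                 # assumed prior preferred values
A = Z.T @ Z + beta**2 * np.eye(m)
k_hat = k0 + np.linalg.solve(A, Z.T @ (h - Z @ k0))

# The regularized normal matrix is invertible even though Z^T Z is singular.
assert np.linalg.matrix_rank(Z.T @ Z) < m
```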

  13. Singular Value Decomposition: h = Zk + ε.

  14. Singular Value Decomposition: h = Zk + ε. 1. Determine what combinations of parameters are estimable. 2. Estimate only those parameter combinations. 3. Leave the values of inestimable parameter combinations unchanged.
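Step 1, determining which parameter combinations are estimable, amounts to counting singular values above a tolerance; a sketch with an assumed random Z and an assumed relative tolerance:

```python
import numpy as np

rng = np.random.default_rng(3)
Z = rng.normal(size=(5, 12))     # assumed: 5 observations, 12 parameters

U, s, Vt = np.linalg.svd(Z, full_matrices=True)

# Parameter combinations V^T k whose singular values exceed a tolerance
# are estimable; the rest lie in the null space and keep their prior values.
tol = 1e-8 * s[0]                # assumed relative tolerance
n_est = int(np.sum(s > tol))     # number of estimable combinations
estimable = Vt[:n_est]           # rows: the estimable combinations of k
```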

  15. Mathematical/Numerical Regularization Remarks Both Tikhonov regularization and SVD can provide a solution of minimum error variance to the inverse problem if certain conditions are met. Which to use? Normally both together SVD for numerical stability and Tikhonov to pursue a solution of minimum error variance in more flexible ways.

  16. The Null Space

  17. Equations are derived on a tablet

  18. Singular Value Decomposition

  19. Basic equation: h = Zk + ε.

  20. Singular Value Decomposition: Z = USVᵀ (diagram: Z equals U times S times Vᵀ).

  21. The U matrix: its columns u₁, u₂, u₃, … are orthogonal unit vectors spanning model output space. U is an orthonormal matrix: UᵀU = UUᵀ = I.

  22. Singular Value Decomposition: Z = USVᵀ.

  23. The V matrix: Z = USVᵀ (highlighting Vᵀ in the decomposition).

  24. The V matrix: its columns v₁, v₂, v₃, … are orthogonal unit vectors spanning parameter space. V is an orthonormal matrix: VᵀV = VVᵀ = I.
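The orthonormality of both U and V can be verified directly; the 4×6 random Z below is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(4)
Z = rng.normal(size=(4, 6))

U, s, Vt = np.linalg.svd(Z, full_matrices=True)
V = Vt.T

# U (4x4) spans model output space; V (6x6) spans parameter space.
# Both are orthonormal matrices:
assert np.allclose(U.T @ U, np.eye(4))
assert np.allclose(U @ U.T, np.eye(4))
assert np.allclose(V.T @ V, np.eye(6))
assert np.allclose(V @ V.T, np.eye(6))
```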

  25. Singular Value Decomposition: Z = USVᵀ.

  26. The S matrix: its off-diagonal elements are zero.

  27. The S matrix: its diagonal elements (the singular values) are positive, decreasing down the diagonal until they become zero or the diagonal runs out.

  28. Singular Value Decomposition: Z = USVᵀ. Some v vectors span the solution space; the remaining v vectors span the null space.

  29. Orthogonal vector spaces (diagram: v₁ and v₂ span the solution space; v₃ spans the null space).

  30. Singular Value Decomposition: partition S into S₁ (the nonzero singular values) and S₂ (the zero singular values), with V partitioned conformably into V₁ and V₂. Then Z = USVᵀ = US₁V₁ᵀ + US₂V₂ᵀ, and since the singular values in S₂ are zero, Z = US₁V₁ᵀ.
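The partition into solution and null spaces can be demonstrated in NumPy; the 4×6 random Z, which has a two-dimensional null space, is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(5)
Z = rng.normal(size=(4, 6))      # rank 4, so the null space is 2-dimensional

U, s, Vt = np.linalg.svd(Z, full_matrices=True)
r = int(np.sum(s > 1e-12))       # number of nonzero singular values

# Partition V: V1 spans the solution space, V2 the null space.
V1, V2 = Vt[:r].T, Vt[r:].T
S1 = np.diag(s[:r])

# Z = U S1 V1^T exactly, since the singular values paired with V2 are zero.
assert np.allclose(Z, U[:, :r] @ S1 @ V1.T)
# Null-space parameter combinations have no effect on model outputs: Z V2 = 0.
assert np.allclose(Z @ V2, 0)
```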

  31. Minimum Error Variance Solution to Inverse Problem

  32. Equations are derived on a tablet

  33. Conclusions

  34. Benefits of Singular Value Decomposition: solution of the inverse problem will never encounter a matrix that cannot be inverted, so the solution process will always be numerically stable: k = V₁S₁⁻¹Uᵀh.
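A sketch of this solution, k = V₁S₁⁻¹Uᵀh, using an assumed random Z and noise-free data; only the nonzero singular values are ever inverted, so no singular matrix arises:

```python
import numpy as np

rng = np.random.default_rng(6)
Z = rng.normal(size=(4, 6))      # assumed: 4 observations, 6 parameters
k_true = rng.normal(size=6)
h = Z @ k_true                   # noise-free data for illustration

U, s, Vt = np.linalg.svd(Z, full_matrices=False)
r = int(np.sum(s > 1e-12 * s[0]))

# k = V1 S1^-1 U^T h: invert only the nonzero singular values.
V1 = Vt[:r].T
k_hat = V1 @ np.diag(1.0 / s[:r]) @ U[:, :r].T @ h

# k_hat reproduces the data; it differs from k_true only by a
# null-space component, and is the minimum-norm solution.
assert np.allclose(Z @ k_hat, h)
```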

  35. Problem with Singular Value Decomposition: to guarantee minimum error variance, its use should be preceded by a Karhunen-Loève transformation, where C(k) = EFEᵀ and j = F⁻½Eᵀk. Hence we estimate basis functions, which are the columns of E; these are eigencomponents of the covariance matrix of k. But what if we are wrong about C(k) (as we always will be)? How can we learn something about spatial heterogeneity from the inversion process?
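A sketch of the Karhunen-Loève transformation under these definitions; the covariance matrix C below and the reading j = F⁻½Eᵀk are illustrative assumptions reconstructed from the slide:

```python
import numpy as np

rng = np.random.default_rng(7)
# A hypothetical parameter covariance matrix C(k), symmetric positive definite.
A = rng.normal(size=(5, 5))
C = A @ A.T + 5 * np.eye(5)

# Eigendecomposition C = E F E^T: the columns of E are the basis functions.
F, E = np.linalg.eigh(C)
assert np.allclose(C, E @ np.diag(F) @ E.T)

# Karhunen-Loeve transform: j = F^(-1/2) E^T k. The transformed
# parameters are uncorrelated with unit variance.
k = rng.multivariate_normal(np.zeros(5), C, size=10000)
j = (np.diag(1.0 / np.sqrt(F)) @ E.T @ k.T).T
# The sample covariance of j is approximately the identity.
```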

  36. Solutions to these problems: use singular value decomposition to calculate k; use Tikhonov regularization to formulate the inverse problem.

  37. The End
