Efficient Gradient Computing: Understanding Backpropagation and Computational Graphs

Learn how to compute gradients efficiently with backpropagation and computational graphs: worked examples, a chain-rule review, and the gradient computation for feedforward and recurrent networks.

  • Gradient Computing
  • Backpropagation
  • Computational Graphs
  • Machine Learning


Presentation Transcript


  1. Computing Gradient Hung-yi Lee

  2. Introduction. Backpropagation: an efficient way to compute the gradient. Prerequisites: backpropagation for feedforward nets, http://speech.ee.ntu.edu.tw/~tlkagk/courses/MLDS_2015_2/Lecture/DNN%20backprop.ecm.mp4/index.html (simple version: https://www.youtube.com/watch?v=ibJpTrp5mcE), and backpropagation through time for RNNs, http://speech.ee.ntu.edu.tw/~tlkagk/courses/MLDS_2015_2/Lecture/RNN%20training%20(v6).ecm.mp4/index.html This lecture: understanding backpropagation through computational graphs, the formulation used by TensorFlow, Theano, CNTK, etc.

  3. Computational Graph

  4. Computational Graph: a language describing a function. A node is a variable (scalar, vector, tensor, ...) and an edge is an operation (a simple function). For example, c = f(a, b) is a graph with nodes a, b, c and an edge f. A composition y = f(g(h(x))) becomes a chain of nodes: u = h(x), v = g(u), y = f(v).

  5. Computational Graph. Example: e = (a + b)(b + 1), decomposed as c = a + b, d = b + 1, e = c × d. With a = 2 and b = 1, the forward pass gives c = 3, d = 2, e = 6.

  6. Review: Chain Rule. Case 1: if y = g(x) and z = h(y), then dz/dx = (dz/dy)(dy/dx). Case 2: if x = g(s), y = h(s), and z = k(x, y), then dz/ds = (∂z/∂x)(dx/ds) + (∂z/∂y)(dy/ds).
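
To make Case 2 concrete, here is a small numerical check (a sketch, not from the slides; the choice x = g(s) = s², y = h(s) = sin s, z = k(x, y) = x·y is made up) comparing the chain-rule formula with a finite difference:

```python
import math

# Case 2: x = g(s) = s^2, y = h(s) = sin(s), z = k(x, y) = x * y.
def z_of_s(s):
    return (s ** 2) * math.sin(s)

def dz_ds(s):
    x, y = s ** 2, math.sin(s)
    dz_dx, dz_dy = y, x                    # partials of z = x * y
    dx_ds, dy_ds = 2 * s, math.cos(s)
    # dz/ds = (∂z/∂x)(dx/ds) + (∂z/∂y)(dy/ds)
    return dz_dx * dx_ds + dz_dy * dy_ds

s, eps = 1.3, 1e-6
numeric = (z_of_s(s + eps) - z_of_s(s - eps)) / (2 * eps)
print(dz_ds(s), numeric)  # the two values agree to ~1e-9
```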

  7. Computational Graph. Example: e = (a + b)(b + 1). To compute ∂e/∂b, sum over all paths from b to e: ∂e/∂b = (∂e/∂c)(∂c/∂b) + (∂e/∂d)(∂d/∂b). The edge derivatives are ∂e/∂c = d = b + 1, ∂e/∂d = c = a + b, and ∂c/∂a = ∂c/∂b = ∂d/∂b = 1, so ∂e/∂b = 1 × (b + 1) + 1 × (a + b).

  8. Computational Graph. The same computation with numbers, a = 3, b = 2: the forward pass gives c = 5, d = 3, e = 15, and the edge derivatives become ∂e/∂c = d = 3 and ∂e/∂d = c = 5.

  9. Computational Graph. Compute ∂e/∂b in reverse (a = 3, b = 2): start with 1 at the output e, multiply edge derivatives while moving backward, and sum the two paths into b: ∂e/∂b = 3 × 1 + 5 × 1 = 8.

  10. Computational Graph. Compute ∂e/∂a the same way (a = 3, b = 2): start with 1 at e and follow the single path through c: ∂e/∂a = 3 × 1 = 3.

  11. Computational Graph, reverse mode. What is the benefit? Starting with 1 at e, one backward sweep produces both ∂e/∂a = 3 and ∂e/∂b = 8 (a = 3, b = 2): a single pass yields the derivative of the output with respect to every input.
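
A minimal sketch of this reverse sweep in plain Python, hard-coding the graph from the slides (the grad_* names are my own):

```python
# Forward pass over e = (a + b) * (b + 1) with a = 3, b = 2.
a, b = 3.0, 2.0
c = a + b                 # c = 5
d = b + 1                 # d = 3
e = c * d                 # e = 15

# Reverse sweep: start with 1 at the output and push it back along each edge.
grad_e = 1.0
grad_c = grad_e * d       # ∂e/∂c = d = 3
grad_d = grad_e * c       # ∂e/∂d = c = 5
grad_a = grad_c * 1.0     # ∂c/∂a = 1, so ∂e/∂a = 3
grad_b = grad_c * 1.0 + grad_d * 1.0  # two paths, b→c→e and b→d→e: ∂e/∂b = 8
print(grad_a, grad_b)     # 3.0 8.0
```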

  12. Computational Graph, parameter sharing: the same parameter can appear in different nodes. Example: y = x exp(x²), decomposed as u = x², v = exp(u), y = x × v, so x feeds two nodes. The contributions from each appearance add up: dy/dx = ∂y/∂x + (∂y/∂v)(∂v/∂u)(∂u/∂x) = exp(x²) + x × exp(x²) × 2x.
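
The same pattern in code: when x appears in two nodes, its gradient accumulates once per appearance (a sketch with a finite-difference check; the decomposition into u and v follows the slide):

```python
import math

x = 0.5
u = x ** 2                # u = x^2
v = math.exp(u)           # v = exp(u)
y = x * v                 # y = x * v

grad_y = 1.0
grad_x = grad_y * v       # first appearance: ∂y/∂x through the product = v
grad_v = grad_y * x       # ∂y/∂v = x
grad_u = grad_v * v       # ∂v/∂u = exp(u)
grad_x += grad_u * 2 * x  # second appearance: ∂u/∂x = 2x, added on top

eps = 1e-6
numeric = ((x + eps) * math.exp((x + eps) ** 2)
           - (x - eps) * math.exp((x - eps) ** 2)) / (2 * eps)
print(grad_x, numeric)    # both ≈ 1.926
```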

  13. Computational Graph for Feedforward Net

  14. Review: Backpropagation. Define the error signal δ_i^l = ∂C/∂z_i^l; then ∂C/∂w_ij^l = (∂C/∂z_i^l)(∂z_i^l/∂w_ij^l) = δ_i^l a_j^{l-1}. Forward pass: z^l = W^l a^{l-1} + b^l and a^l = σ(z^l), with a^0 = x. Backward pass: δ^l = σ′(z^l) ⊙ (W^{l+1})ᵀ δ^{l+1}.

  15. Review: Backpropagation, the last layers. At the output layer, δ^L = σ′(z^L) ⊙ ∂C/∂y (we do not use softmax here), and one layer down, δ^{L-1} = σ′(z^{L-1}) ⊙ (W^L)ᵀ δ^L.
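
A numpy sketch of these recursions for a two-layer sigmoid net (the shapes, the random values, and the squared-error cost are made-up assumptions; the slides leave the cost generic):

```python
import numpy as np

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
rng = np.random.default_rng(0)

x, t = rng.normal(size=(3, 1)), rng.normal(size=(2, 1))   # input and target
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=(4, 1))
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=(2, 1))

# Forward pass: z^l = W^l a^{l-1} + b^l, a^l = σ(z^l).
z1 = W1 @ x + b1;  a1 = sigmoid(z1)
z2 = W2 @ a1 + b2; y = sigmoid(z2)

# Backward pass: δ^L = σ'(z^L) ⊙ ∂C/∂y, δ^l = σ'(z^l) ⊙ (W^{l+1})^T δ^{l+1}.
dC_dy = y - t                                   # C = ||y - t||^2 / 2 (assumed)
delta2 = sigmoid(z2) * (1 - sigmoid(z2)) * dC_dy
delta1 = sigmoid(z1) * (1 - sigmoid(z1)) * (W2.T @ delta2)

# ∂C/∂W^l = δ^l (a^{l-1})^T, ∂C/∂b^l = δ^l.
grad_W2, grad_b2 = delta2 @ a1.T, delta2
grad_W1, grad_b1 = delta1 @ x.T, delta1
```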

  16. Feedforward Network. y = f(x) is computed as z^1 = W^1 x + b^1, a^1 = σ(z^1), z^2 = W^2 a^1 + b^2, y = σ(z^2). As a computational graph this is the chain x → z^1 → a^1 → z^2 → y, with the parameters W^1, b^1, W^2, b^2 feeding in as extra nodes.

  17. Loss Function of Feedforward Network. Append the cost node C = L(y, ŷ), where ŷ is the target, so the graph becomes x → z^1 → a^1 → z^2 → y → C.

  18. Gradient of Cost Function. C = L(y, ŷ). To compute the gradient, attach the partial derivative to each edge of the graph and use reverse mode: the output C is always a scalar while the parameters are vectors or matrices, so a single backward pass from C yields both ∂C/∂W^1 and ∂C/∂W^2. Along the way we need the edge Jacobians ∂C/∂y, ∂y/∂z^2, ∂z^2/∂a^1, ∂a^1/∂z^1, ∂z^1/∂W^1.

  19. Jacobian Matrix. For y = f(x) with y = [y_1, y_2] and x = [x_1, x_2, x_3], the Jacobian ∂y/∂x is a (size of y) × (size of x) matrix with entries (∂y/∂x)_ij = ∂y_i/∂x_j. Example: y_1 = x_1 + x_2 x_3, y_2 = 2 x_3 gives ∂y/∂x = [[1, x_3, x_2], [0, 0, 2]].
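
Checking the example Jacobian numerically (a small numpy sketch; the test point is arbitrary):

```python
import numpy as np

def f(x):
    x1, x2, x3 = x
    return np.array([x1 + x2 * x3, 2 * x3])

def jacobian_f(x):
    x1, x2, x3 = x
    return np.array([[1.0, x3, x2],     # row i, column j holds ∂y_i/∂x_j
                     [0.0, 0.0, 2.0]])

x, eps = np.array([1.0, 2.0, 3.0]), 1e-6
# Central differences, one column of the Jacobian per input coordinate.
num = np.stack([(f(x + eps * np.eye(3)[j]) - f(x - eps * np.eye(3)[j])) / (2 * eps)
                for j in range(3)], axis=1)
print(np.allclose(jacobian_f(x), num))  # True
```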

  20. Gradient of Cost Function, last layer: ∂C/∂y. Suppose the target ŷ is class r and the cost is cross entropy, C = -ln y_r. Then ∂C/∂y_r = -1/y_r and ∂C/∂y_i = 0 for i ≠ r, so ∂C/∂y is a vector with a single nonzero entry.
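
In code (a sketch; the output distribution y and the target class r are made up):

```python
import numpy as np

y = np.array([0.1, 0.7, 0.2])   # network output, already a distribution
r = 1                           # target class
C = -np.log(y[r])               # cross entropy: C = -ln y_r

dC_dy = np.zeros_like(y)
dC_dy[r] = -1.0 / y[r]          # only entry r is nonzero
print(dC_dy)                    # [ 0.         -1.42857143  0.        ]
```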

  21. Gradient of Cost Function. ∂y/∂z^2 is a square Jacobian matrix whose i-th row, j-th column entry is ∂y_i/∂z_j^2. Because y = σ(z^2) is applied elementwise, ∂y_i/∂z_j^2 = σ′(z_i^2) if i = j and 0 otherwise, so ∂y/∂z^2 is a diagonal matrix. How about softmax?
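
A short numpy illustration of both cases; the dense softmax Jacobian ∂y_i/∂z_j = y_i (δ_ij - y_j) is the standard result and answers the question above:

```python
import numpy as np

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

z2 = np.array([0.5, -1.0, 2.0])

# Elementwise sigmoid: the Jacobian is diagonal.
J_sigmoid = np.diag(sigmoid(z2) * (1 - sigmoid(z2)))

# Softmax: every y_i depends on every z_j, so the Jacobian is dense.
y = softmax(z2)
J_softmax = np.diag(y) - np.outer(y, y)
print(J_sigmoid, J_softmax, sep="\n")
```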

  22. ∂z^2/∂a^1 is a Jacobian matrix whose i-th row, j-th column entry is ∂z_i^2/∂a_j^1 = w_ij^2. Since z^2 = W^2 a^1 + b^2, i.e. z_i^2 = Σ_j w_ij^2 a_j^1 + b_i^2, the whole Jacobian is simply ∂z^2/∂a^1 = W^2.

  23. ∂z^2/∂W^2: viewing the m × n matrix W^2 as a vector of length m × n, ∂z^2/∂W^2 is an m × (m·n) Jacobian whose entry in row i, column (j-1) × n + k is ∂z_i^2/∂w_jk^2. Since z_i^2 = Σ_k w_ik^2 a_k^1 + b_i^2, this entry equals a_k^1 when i = j and 0 when i ≠ j. (Considering ∂z^2/∂W^2 as a tensor makes things easier.)

  24. The resulting Jacobian is very sparse: row i holds (a^1)ᵀ in the block of columns corresponding to row i of W^2 (j = i) and zeros everywhere else, e.g. row i = 1 is [(a^1)ᵀ, 0, ..., 0] and row i = 2 is [0, (a^1)ᵀ, 0, ..., 0].
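
In practice frameworks typically avoid materializing this sparse matrix: with the upstream gradient δ² = ∂C/∂z², the vector-Jacobian product collapses to the outer product δ² (a¹)ᵀ. A numpy sketch with made-up shapes confirming the two agree:

```python
import numpy as np

m, n = 2, 3
rng = np.random.default_rng(1)
a1 = rng.normal(size=n)
delta2 = rng.normal(size=m)        # ∂C/∂z², assumed given

# Explicit flattened Jacobian: row i holds a1 in its own block of n columns.
J = np.zeros((m, m * n))
for i in range(m):
    J[i, i * n:(i + 1) * n] = a1

grad_flat = delta2 @ J             # vector-Jacobian product, length m*n
grad_outer = np.outer(delta2, a1)  # the same numbers as an m×n matrix
print(np.allclose(grad_flat.reshape(m, n), grad_outer))  # True
```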

  25. Putting the pieces together: ∂C/∂W^1 = (∂C/∂y)(∂y/∂z^2)(∂z^2/∂a^1)(∂a^1/∂z^1)(∂z^1/∂W^1) and ∂C/∂W^2 = (∂C/∂y)(∂y/∂z^2)(∂z^2/∂W^2). Because C is a scalar, reverse mode evaluates these products from the left, so every intermediate result stays a vector rather than a matrix.
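
A quick consistency check (assumed shapes and a squared-error cost): multiplying the Jacobians along the path gives the same ∂C/∂a¹ as the elementwise backward pass from the earlier review slides:

```python
import numpy as np

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
rng = np.random.default_rng(2)

a1 = rng.normal(size=3)
W2, b2 = rng.normal(size=(2, 3)), rng.normal(size=2)
t = rng.normal(size=2)

z2 = W2 @ a1 + b2
y = sigmoid(z2)

dC_dy = y - t                                       # ∂C/∂y for squared error
J_y_z2 = np.diag(sigmoid(z2) * (1 - sigmoid(z2)))   # diagonal Jacobian
J_z2_a1 = W2                                        # ∂z²/∂a¹ = W²

dC_da1 = dC_dy @ J_y_z2 @ J_z2_a1                   # product of Jacobians
delta2 = sigmoid(z2) * (1 - sigmoid(z2)) * dC_dy    # elementwise backprop
print(np.allclose(dC_da1, W2.T @ delta2))           # True
```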

  26. Question. Q: Is the backward pass the only way to traverse a computational graph? Q: Do we get the same results from the two different approaches?

  27. Computational Graph for Recurrent Network

  28. Recurrent Network. At each step t, (y^t, h^t) = f(x^t, h^{t-1}; W_i, W_h, W_o), with h^t = σ(W_i x^t + W_h h^{t-1}) and y^t = softmax(W_o h^t) (biases are ignored here). Unrolled, the same cell f maps h^0, x^1 → h^1, then h^1, x^2 → h^2, then h^2, x^3 → h^3, emitting y^1, y^2, y^3.

  29. Recurrent Network, one step in detail: h^t = σ(W_i x^t + W_h h^{t-1}), y^t = softmax(W_o h^t). As a graph for t = 1: n^1 = W_i x^1, m^1 = W_h a^0 (the initial hidden state), z^1 = n^1 + m^1, h^1 = σ(z^1), o^1 = W_o h^1, y^1 = softmax(o^1).

  30. Recurrent Network, abstracted: the whole step h^t = σ(W_i x^t + W_h h^{t-1}), y^t = softmax(W_o h^t) is drawn as a single block mapping h^0 and x^1 to h^1 and y^1, with W_i, W_h, W_o as parameter nodes feeding it.
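
A numpy sketch of the forward pass just described (dimensions and inputs are made up; biases are ignored as in the slides):

```python
import numpy as np

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(3)
d_in, d_h, d_out = 4, 5, 3
Wi = rng.normal(size=(d_h, d_in))
Wh = rng.normal(size=(d_h, d_h))
Wo = rng.normal(size=(d_out, d_h))

xs = [rng.normal(size=d_in) for _ in range(3)]  # x^1, x^2, x^3
h = np.zeros(d_h)                               # h^0
ys = []
for x in xs:
    h = sigmoid(Wi @ x + Wh @ h)                # h^t = σ(W_i x^t + W_h h^{t-1})
    ys.append(softmax(Wo @ h))                  # y^t = softmax(W_o h^t)
```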

  31. Recurrent Network, total cost C = C^1 + C^2 + C^3, where C^t is the cost at step t. The unrolled graph contains three copies of the step, and the same parameter nodes W_i, W_h, W_o feed all of them: parameters are shared across time.

  32. Running reverse mode on the unrolled graph (C = C^1 + C^2 + C^3): start with 1 at C, so ∂C/∂C^t = 1 for every t; then propagate through the edges ∂C^t/∂y^t, ∂y^t/∂h^t, and, across time, ∂h^t/∂h^{t-1}, collecting contributions to ∂C/∂W_i, ∂C/∂W_h, ∂C/∂W_o at every step.

  33. Because W_h is shared across steps, its gradient is the sum of per-step contributions: ∂C/∂W_h = Σ_t (∂C/∂h^t)(∂h^t/∂W_h), where each ∂C/∂h^t itself accumulates the direct path through C^t and the paths through all later steps, e.g. ∂C/∂h^2 = (∂C^2/∂h^2) + (∂C/∂h^3)(∂h^3/∂h^2) and ∂C/∂h^1 = (∂C^1/∂h^1) + (∂C/∂h^2)(∂h^2/∂h^1).
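
A scalar BPTT sketch of this summation (everything here is a made-up assumption to keep it short, including the toy cost C^t = h^t); the hand-accumulated gradient for the shared w_h matches a finite difference:

```python
import math

sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))

def run(w_h, w_i=0.5, xs=(1.0, -0.5, 2.0)):
    h, hs, zs = 0.0, [0.0], []
    for x in xs:
        z = w_i * x + w_h * h
        h = sigmoid(z)
        zs.append(z)
        hs.append(h)
    return hs, zs, sum(hs[1:])        # C = C^1 + C^2 + C^3 with C^t = h^t

w_h = 0.8
hs, zs, C = run(w_h)

# Backward through time: grad_h carries ∂C/∂h^t from right to left.
grad_wh, grad_h = 0.0, 0.0
for t in reversed(range(3)):
    grad_h += 1.0                               # direct path: ∂C^t/∂h^t = 1
    s = sigmoid(zs[t]) * (1 - sigmoid(zs[t]))   # σ'(z^t)
    grad_wh += grad_h * s * hs[t]               # this step's use of w_h (times h^{t-1})
    grad_h = grad_h * s * w_h                   # pass ∂C/∂h^{t-1} back one step

eps = 1e-6
numeric = (run(w_h + eps)[2] - run(w_h - eps)[2]) / (2 * eps)
print(grad_wh, numeric)  # the two agree
```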

  34. Reference. Textbook: Deep Learning, Chapter 6.5. Calculus on Computational Graphs: Backpropagation, https://colah.github.io/posts/2015-08-Backprop/ On chain rule, computational graphs, and backpropagation, http://outlace.com/Computational-Graph/

  35. Acknowledgement
