Deep Learning Concepts in AI for Medicine Lecture

An overview of deep learning concepts from AI for Medicine, Lecture 21: computation graphs, forward and backward propagation, the chain rule, and an introduction to gradient descent and vectorization, along with course announcements about the upcoming assignment and quiz.

  • Deep Learning
  • AI for Medicine
  • Computation Graphs
  • Neural Networks
  • Gradient Descent

Presentation Transcript


  1. AI for Medicine, Lecture 21: Deep Learning Part II. April 12, 2021. Mohammad Hammoud, Carnegie Mellon University in Qatar.

  2. Today. Last Monday's Session: Deep Learning Part I. Today's Session: Deep Learning Part II. Announcements: Assignment 3 is due on Wednesday, April 14, by midnight; Quiz II is on April 19.

  3. Outline. Deep Learning: Computation Graph, Gradient Descent Overview, Vectorization.

  4. The Flow of Computations in Neural Networks. The flow of computations in a neural network goes in two ways: 1. Left-to-right: this is referred to as forward propagation, which results in computing the output of the network. 2. Right-to-left: this is referred to as back propagation, which results in computing the gradients (or derivatives) of the parameters in the network. The intuition behind this two-way flow of computations can be explained through the concept of computation graphs. What is a computation graph?

  5. What is a Computation Graph? Let us assume we want to compute the following function J: J(a, b, c) = 3(a + bc), with a = 2, b = 4, and c = 3. We can break it into three steps: u = bc, v = a + u, and J = 3v. Computation Graph: a, b, and c feed into u = bc, then v = a + u, then J = 3v, giving u = 12, v = 14, and J = 42.

  6. Forward Propagation. Let us assume we want to compute the following function J: J(a, b, c) = 3(a + bc), with a = 2, b = 4, and c = 3. Moving left to right through the computation graph, we first compute u = bc = 4 × 3 = 12.

  7. Forward Propagation. Continuing left to right through the computation graph, we next compute v = a + u = 2 + 12 = 14.

  8. Forward Propagation. Finally, we compute the output J = 3v = 3 × 14 = 42.

  9. Forward Propagation. With u = 12, v = 14, and J = 42, the left-to-right pass is complete. Forward propagation allows computing J.
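
The forward pass above can be written directly as code. The following is a minimal Python sketch (not from the slides) that mirrors the three steps of the computation graph, using the lecture's values a = 2, b = 4, c = 3:

```python
# Forward pass through the computation graph of J(a, b, c) = 3(a + bc).
def forward(a, b, c):
    u = b * c      # first node:  u = bc
    v = a + u      # second node: v = a + u
    J = 3 * v      # output node: J = 3v
    return u, v, J

u, v, J = forward(a=2, b=4, c=3)
print(u, v, J)     # 12 14 42
```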

  10. Backward Propagation. Let us now compute the derivatives of the variables through the computation graph (u = bc = 12, v = a + u = 14, J = 3v = 42, with a = 2, b = 4, c = 3). We start with dJ/dv, the derivative of J with respect to v.

  11. Backward Propagation. If we change v a little bit, how would J change? Nudging v from 14 to 14.001 makes J go from 42 to 42.003.

  12. Backward Propagation. Nudging v from 14 to 14.001 makes J go from 42 to 42.003, so dJ/dv = 3. To compute the derivative of J with respect to v, we went back to v, nudged it, and measured the corresponding resultant increase in J.
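
The "nudge it and measure the change" procedure is exactly a numerical (finite-difference) derivative. Below is a small sketch, assuming the same step size of 0.001 used on the slides:

```python
# Numerically estimate dJ/dv by nudging v, as done on the slides:
# v: 14 -> 14.001 makes J: 42 -> 42.003, so dJ/dv is approximately 3.
def J_of_v(v):
    return 3 * v

eps = 0.001
v = 14
dJ_dv = (J_of_v(v + eps) - J_of_v(v)) / eps
print(dJ_dv)   # ~3.0 (up to floating-point rounding)
```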

  13. Backward Propagation. Next is dJ/da, the derivative of J with respect to a.

  14. Backward Propagation. If we change a a little bit, how would J change? Nudging a from 2 to 2.001 makes v go from 14 to 14.001 and J go from 42 to 42.003.

  15. Backward Propagation. Nudging a from 2 to 2.001 makes v go to 14.001 and J go to 42.003, so dJ/da = 3 = (dJ/dv)(dv/da).

  16. Backward Propagation. dJ/da = (dJ/dv)(dv/da): the change in a caused a change in v.

  17. Backward Propagation. dJ/da = (dJ/dv)(dv/da): and the change in v caused a change in J.

  18. Backward Propagation. dJ/da = (dJ/dv)(dv/da): this is known as the chain rule in calculus.

  19. Backward Propagation. dJ/da = (dJ/dv)(dv/da) = dJ/dv × 1.

  20. Backward Propagation. dJ/da = 3 × 1 = 3.

  21. Backward Propagation. dJ/da = (dJ/dv)(dv/da) = 3. In essence, to compute the derivative of J with respect to a, we had to go back to a, nudge it a little bit, and measure the corresponding resultant increase in v.

  22. Backward Propagation. Then, we had to go back to v, nudge it a little bit, and measure the corresponding resultant increase in J.

  23. Backward Propagation. Then, we multiplied the changes together (i.e., we applied the chain rule!).
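
As a quick check of the chain-rule result, the sketch below (illustrative, not from the slides) computes dJ/da both ways: by multiplying dJ/dv and dv/da, and by nudging a directly and measuring the change in J:

```python
def forward(a, b, c):
    u = b * c
    v = a + u
    return 3 * v               # J

# Chain rule: dJ/da = dJ/dv * dv/da = 3 * 1
dJ_da_chain = 3 * 1

# Direct nudge: a goes from 2 to 2.001; measure the resulting change in J
eps = 0.001
dJ_da_nudge = (forward(2 + eps, 4, 3) - forward(2, 4, 3)) / eps

print(dJ_da_chain, round(dJ_da_nudge, 6))   # 3 3.0
```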

  24. Backward Propagation. Next is dJ/du, the derivative of J with respect to u.

  25. Backward Propagation. If we change u a little bit, how would J change? Nudging u from 12 to 12.001 makes v go from 14 to 14.001 and J go from 42 to 42.003.

  26. Backward Propagation. Nudging u from 12 to 12.001 makes v go to 14.001 and J go to 42.003, so dJ/du = 3 = (dJ/dv)(dv/du).

  27. Backward Propagation. dJ/du = (dJ/dv)(dv/du): the change in u caused a change in v.

  28. Backward Propagation. dJ/du = (dJ/dv)(dv/du): and the change in v caused a change in J.

  29. Backward Propagation. dJ/du = (dJ/dv)(dv/du) = dJ/dv × 1.

  30. Backward Propagation. dJ/du = 3 × 1 = 3.

  31. Backward Propagation. dJ/du = (dJ/dv)(dv/du) = 3.

  32. Backward Propagation. Same as before, we had to go back to v and then to u in order to compute the derivative of J with respect to u.

  33. Backward Propagation. Next is dJ/db, the derivative of J with respect to b.

  34. Backward Propagation. If we change b a little bit, how would J change?

  35. Backward Propagation. Nudging b from 4 to 4.001 makes u go from 12 to 12.003, v go from 14 to 14.003, and J go from 42 to 42.009, so dJ/db = 9 = (dJ/dv)(dv/du)(du/db).

  36. Backward Propagation. dJ/db = (dJ/dv)(dv/du)(du/db), where dJ/dv = 3, dv/du = 1, and du/db = c = 3.

  37. Backward Propagation. dJ/db = (dJ/dv)(dv/du)(du/db) = 3 × 1 × 3.

  38. Backward Propagation. dJ/db = 3 × 1 × 3 = 9.

  39. Backward Propagation. dJ/db = 9: the chain rule, now applied through both v and u.
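
The derivative with respect to b chains through two intermediate variables (u and v). Here is a short sketch, again checking the chain-rule value against a direct nudge of b:

```python
def forward(a, b, c):
    u = b * c
    v = a + u
    return 3 * v               # J

# Chain rule: dJ/db = dJ/dv * dv/du * du/db = 3 * 1 * c = 9 (with c = 3)
c = 3
dJ_db_chain = 3 * 1 * c

# Direct nudge: b goes from 4 to 4.001; measure the resulting change in J
eps = 0.001
dJ_db_nudge = (forward(2, 4 + eps, c) - forward(2, 4, c)) / eps

print(dJ_db_chain, round(dJ_db_nudge, 6))   # 9 9.0
```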

  40. Backward Propagation. Finally, dJ/dc, the derivative of J with respect to c.

  41. Backward Propagation. If we change c a little bit, how would J change?

  42. Backward Propagation. Nudging c from 3 to 3.001 makes u go from 12 to 12.004, v go from 14 to 14.004, and J go from 42 to 42.012, so dJ/dc = 12 = (dJ/dv)(dv/du)(du/dc).

  43. Backward Propagation. dJ/dc = (dJ/dv)(dv/du)(du/dc), where dJ/dv = 3, dv/du = 1, and du/dc = b = 4.

  44. Backward Propagation. dJ/dc = (dJ/dv)(dv/du)(du/dc) = 3 × 1 × 4.

  45. Backward Propagation. dJ/dc = 3 × 1 × 4 = 12.

  46. Backward Propagation. dJ/dc = (dJ/dv)(dv/du)(du/dc) = 3 × 1 × 4 = 12.
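
Putting all of the pieces together, below is a minimal sketch of a complete forward and backward pass over this graph. The shorthand names da, db, dc, du, dv for dJ/da, dJ/db, and so on are a common convention, not something introduced on these slides:

```python
a, b, c = 2, 4, 3

# Forward pass (left to right)
u = b * c          # 12
v = a + u          # 14
J = 3 * v          # 42

# Backward pass (right to left), applying the chain rule at every node
dv = 3             # dJ/dv
du = dv * 1        # dJ/du = dJ/dv * dv/du = 3
da = dv * 1        # dJ/da = dJ/dv * dv/da = 3
db = du * c        # dJ/db = dJ/du * du/db = 3 * 3 = 9
dc = du * b        # dJ/dc = dJ/du * du/dc = 3 * 4 = 12

print(J, da, db, dc)   # 42 3 9 12
```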

  47. Outline. Deep Learning: Computation Graph, Gradient Descent Overview, Vectorization.

  48. The Computation Graph of Logistic Regression. Let us translate logistic regression (which is a neural network with only 1 neuron) into a computation graph: z = w^T x + b, then ŷ = a = σ(z), then L(a, y), where x = (x1, x2, x3), w = (w1, w2, w3), b is the bias, and L(a, y) is the cost (or loss) function.

  49. Forward Propagation. The loss function can be computed by moving from left to right through the graph: z = w^T x + b, then a = σ(z), then L(a, y).
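
Here is a small sketch of this forward pass for one example with three features. The inputs, weights, and bias below are illustrative, and the cross-entropy form of L(a, y) is an assumption consistent with the derivative shown on the next slide:

```python
import numpy as np

def forward(x, w, b, y):
    """Forward pass of logistic regression (a single-neuron network)."""
    z = np.dot(w, x) + b                   # z = w^T x + b
    a = 1.0 / (1.0 + np.exp(-z))           # a = sigma(z)
    loss = -(y * np.log(a) + (1 - y) * np.log(1 - a))   # assumed cross-entropy L(a, y)
    return z, a, loss

x = np.array([1.0, 2.0, -1.0])             # illustrative inputs x1, x2, x3
w = np.array([0.5, -0.25, 0.1])            # illustrative weights w1, w2, w3
print(forward(x, w, b=0.2, y=1))
```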

  50. Backward Propagation. The derivatives can be computed by moving from right to left through the graph z = w^T x + b, a = σ(z), L(a, y). Starting at the output: dL/da = -y/a + (1 - y)/(1 - a), the partial derivative of L with respect to a, often denoted simply as da.
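
Continuing the sketch above, the backward pass starts from dL/da as on this slide. The remaining steps (dz, dw, db) follow from the chain rule for the sigmoid and cross-entropy combination; they are standard results but are not derived in this part of the transcript:

```python
import numpy as np

def backward(x, a, y):
    """Backward pass for one example; da, dz, dw, db are shorthand for dL/d(.)."""
    da = -y / a + (1 - y) / (1 - a)        # dL/da, as on the slide
    dz = da * a * (1 - a)                  # dL/dz = dL/da * da/dz, which simplifies to a - y
    dw = dz * x                            # dL/dw
    db = dz                                # dL/db
    return da, dz, dw, db

x = np.array([1.0, 2.0, -1.0])             # illustrative inputs
a, y = 0.52, 1                             # illustrative activation and label
print(backward(x, a, y))
```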
