
Learning Hypotheses in Machine Learning: Hebb's Rule and Synaptic Efficacy
Explore the foundational concepts of machine learning with a focus on Hebb's rule and synaptic efficacy. Understand how neural connections are reinforced and learn about limitations and improvements in learning patterns. Discover the implications of postsynaptic and presynaptic rules in artificial neural networks.
Presentation Transcript
Machine Learning (Part II): Test. Angelo Ciaramella
Question: first learning hypotheses
The Hebb rule allows learning only of:
random patterns / statistic patterns / orthogonal patterns
ML Verification tests
Learning: first learning hypotheses
Donald O. Hebb, 1949, in the book titled The Organization of Behavior, on the basis of neurophysiological evidence.
Principle: if two connected neurons are simultaneously active, the synaptic efficacy of the connection is reinforced.
Hebb's rule
Inputs x1, x2, x3, x4 are connected through synapses to outputs y1, y2, y3.
Learning rule: Δw_ij = η y_i x_j, where η is the learning rate.
Hebb's algorithm
1. Initialize the synaptic weights: w_ij = 0
2. Calculate the synaptic changes: Δw_ij = η y_i x_j
3. Update the synaptic weights: w_ij(t) = w_ij(t-1) + Δw_ij
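The three steps above can be sketched in NumPy as follows (a minimal illustration; the function name `hebb_train` and the batch-of-patterns interface are assumptions, not from the slides):

```python
import numpy as np

def hebb_train(X, Y, eta=0.1):
    """Train a single-layer network with the plain Hebb rule.

    X: (n_patterns, n_inputs), Y: (n_patterns, n_outputs).
    Returns the weight matrix W of shape (n_outputs, n_inputs).
    """
    n_out, n_in = Y.shape[1], X.shape[1]
    W = np.zeros((n_out, n_in))        # step 1: initialize w_ij = 0
    for x, y in zip(X, Y):
        dW = eta * np.outer(y, x)      # step 2: delta w_ij = eta * y_i * x_j
        W = W + dW                     # step 3: w_ij(t) = w_ij(t-1) + dW
    return W
```

Note that the rule is purely local: each weight change depends only on the activities of the two units the synapse connects.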
Hebb's rule: learning example
Input/output pairs (x1 x2 x3 x4 | y1 y2 y3):
1 0 0 1 | 1 0 0
0 1 0 0 | 0 1 0
0 0 1 0 | 0 0 1
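Running the Hebb rule on these three pairs, a small sketch (η = 1 is an assumed choice) shows that recall is interference-free because the three input vectors are mutually orthogonal:

```python
import numpy as np

# Input/output pairs from the example: the three inputs are mutually orthogonal.
X = np.array([[1, 0, 0, 1],
              [0, 1, 0, 0],
              [0, 0, 1, 0]], dtype=float)
Y = np.eye(3)          # y = (1,0,0), (0,1,0), (0,0,1)

eta = 1.0
W = np.zeros((3, 4))
for x, y in zip(X, Y):
    W += eta * np.outer(y, x)        # Hebb update for each pair

# Recall: since the inputs are orthogonal, W @ x activates only the
# correct output unit (up to a positive scale), with no interference.
recalled = [int(np.argmax(W @ x)) for x in X]
print(recalled)  # [0, 1, 2]
```

With non-orthogonal inputs the cross terms in W @ x would no longer vanish, producing the mixed responses the next slide calls interferences.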
Considerations
Limitations: the Hebb rule allows learning only of orthogonal patterns; the mixed responses produced by non-orthogonal patterns are called interferences.
Some improvements: the postsynaptic rule and the presynaptic rule.
Postsynaptic rule
Also known as the Stent-Singer rule, after the neurophysiologists who highlighted the mechanism in biological circuits.
The synaptic weight is:
increased when the postsynaptic and presynaptic units are both active;
decreased when the postsynaptic unit is active but the presynaptic unit is inactive.
It reduces the interference phenomenon, but it produces too many inhibitory synapses, a configuration not found in biological systems although common in artificial neural networks.
Presynaptic rule
The synaptic weight is:
increased when the postsynaptic and presynaptic units are both active;
decreased when the presynaptic unit is active but the postsynaptic unit is inactive.
It works well when many different, partially overlapping patterns need to be associated with the same pattern.
Postsynaptic rule: Δw_ij = η [ y_i x_j + (x_j - 1) y_i ]
Presynaptic rule: Δw_ij = η [ y_i x_j + (y_i - 1) x_j ]
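A minimal sketch of the two update rules, assuming binary (0/1) unit activations; the helper names `postsynaptic_update` and `presynaptic_update` are hypothetical:

```python
import numpy as np

def postsynaptic_update(W, x, y, eta=0.1):
    # dw_ij = eta * [ y_i x_j + (x_j - 1) y_i ]
    return W + eta * (np.outer(y, x) + np.outer(y, x - 1))

def presynaptic_update(W, x, y, eta=0.1):
    # dw_ij = eta * [ y_i x_j + (y_i - 1) x_j ]
    return W + eta * (np.outer(y, x) + np.outer(y - 1, x))

# With binary units, each rule strengthens an active-active pair and
# weakens exactly one of the two mismatched cases.
W0 = np.zeros((1, 1))
both_active = postsynaptic_update(W0, np.array([1.0]), np.array([1.0]), eta=1.0)
post_only = postsynaptic_update(W0, np.array([0.0]), np.array([1.0]), eta=1.0)
```

Here `both_active` ends at +1 (reinforcement) and `post_only` at -1 (the postsynaptic unit fired without input, so the weight is depressed).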
Hebbian learning and NNs
NNs based on Hebb's rule: the Hopfield network.
A recurrent artificial NN, described by Little in 1974 and popularized by John Hopfield in 1982.
A content-addressable (associative) memory system with binary threshold nodes.
Hopfield networks are guaranteed to converge to a local minimum, but they may converge to a false pattern (a wrong local minimum) rather than the stored pattern (the expected local minimum).
They provide a model for understanding human memory.
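A toy sketch along these lines (the six-unit bipolar patterns and the sequential update schedule are illustrative assumptions, not from the slides):

```python
import numpy as np

# Two bipolar (+1/-1) patterns stored with the Hebb rule:
# W = sum_p p p^T, with self-connections zeroed out.
patterns = np.array([[ 1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1, -1, -1, -1]])
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0.0)

def recall(state, sweeps=10):
    """Asynchronous binary-threshold updates; the network energy never
    increases, so the state settles into a local minimum (which may be a
    spurious pattern rather than a stored one)."""
    state = state.copy()
    for _ in range(sweeps):
        for i in range(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Corrupt one bit of the first stored pattern; recall cleans it up.
noisy = patterns[0].copy()
noisy[0] = -noisy[0]
restored = recall(noisy)
```

With only two stored patterns in six units the network is well below its capacity, so the corrupted input falls back into the intended minimum; overloading the network is what produces false patterns.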
Hebbian learning and NNs
NNs based on Hebb's rule: Oja's rule.
Proposed by the Finnish computer scientist Erkki Oja.
It is a model of how neurons in the brain or in artificial neural networks change connection strength.
It solves the stability problems of Hebbian learning.
It generates an algorithm for Principal Component Analysis (PCA); non-linear variants lead to non-linear PCA and Independent Component Analysis (ICA).
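A small sketch of Oja's rule on synthetic 2-D data (the data distribution, learning rate, and initialization are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean 2-D data with most variance along the direction (1, 1)/sqrt(2).
raw = rng.normal(size=(2000, 2)) * np.array([3.0, 0.3])
rot = np.array([[1.0, -1.0], [1.0, 1.0]]) / np.sqrt(2.0)
data = raw @ rot.T

w = np.array([1.0, 0.0])
eta = 0.01
for x in data:
    y = w @ x
    w += eta * y * (x - y * w)   # Oja: Hebbian term y*x minus decay y^2 * w

# The decay term keeps ||w|| close to 1 (the stability fix over plain
# Hebbian learning, whose weights grow without bound), and w aligns with
# the first principal component, here (1, 1)/sqrt(2) up to sign.
```

This is exactly the PCA connection: the single Oja neuron extracts the leading principal component of its input stream online, without ever forming the covariance matrix.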
References
Material: slides, video lessons, books.
D. Floreano, Manuale sulle Reti Neurali, Il Mulino, 1996.
C. M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.