
Fundamentals of Artificial Intelligence Course Overview
Dive into the world of artificial intelligence with this comprehensive course covering single-neuron computation, error and weight updating, the backpropagation algorithm, and more. Join Dr. Saman Mirza Abdullah to explore the core concepts of AI at Tishik-IT.
Presentation Transcript
Fundamentals of Artificial Intelligence (IT-456)
Dr. Saman Mirza Abdullah
Saman.mirza@tiu.edu.iq
Objectives
The main objectives of this class are:
- Reviewing a single neuron with single and multiple inputs
- Presenting the error and weight-updating computation
Artificial Intelligence - Tishik-IT
Review: Single Node Iteration
Use the following graph of an artificial neuron to calculate the errors and draw the error-iteration graph (feed-forward). Given:
1. In = 2
2. Weight for the first loop = 0.1, for the second loop = 0.5
3. Bias (b) = -0.2
4. Activation function = 1 / (1 + e^(-n))
5. Goal = 0.01
6. Desired output = 0.7
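The exercise above can be sketched in Python. This is a minimal sketch, not part of the slides; it assumes the activation is the logistic sigmoid (as the later slides confirm) and reads "first loop" and "second loop" as two feed-forward passes with the two given weights.

```python
import math

def sigmoid(n):
    # logistic activation: 1 / (1 + e^(-n))
    return 1.0 / (1.0 + math.exp(-n))

def forward(x, w, b):
    # weighted input plus bias, passed through the activation
    return sigmoid(w * x + b)

# values from the exercise: input 2, bias -0.2, desired output 0.7
out1 = forward(2, 0.1, -0.2)   # first loop, w = 0.1: sigmoid(0) = 0.5
out2 = forward(2, 0.5, -0.2)   # second loop, w = 0.5: sigmoid(0.8)
err1 = 0.7 - out1              # 0.2
err2 = 0.7 - out2              # roughly 0.01, close to the stated goal
print(out1, err1)
print(out2, err2)
```

With the second weight the error lands near the 0.01 goal, which is presumably the point of the exercise.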
Back Propagation
A single neuron with inputs in1 and in2, weights w1 and w2, and an activation function (A.F) producing the actual output.
Input 1 = 1, Input 2 = 1, Desired OP = 1
Initial weights: W1 = 0.4, W2 = -0.1
Stop if error < 0.001
Back Propagation Algorithm Example
Iteration 1, 1st Step: Forward Computation
in1 = 1 and in2 = 1; initial weights W1 = 0.4, W2 = -0.1; Desired OP = 1
P = in1*W1 + in2*W2
P = 1*0.4 + 1*(-0.1)
P = 0.4 - 0.1 = 0.3
OP = 1 / (1 + e^(-0.3))
OP = 0.572
Iteration 1, 1st Step (continued): Computing Errors
Error OP = OP * (1 - OP) * (Desired OP - OP)
Error OP = 0.572 * (1 - 0.572) * (1 - 0.572) = 0.105
Because the error is greater than 0.001, we need to update the weights.
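The forward pass and error term above can be checked with a few lines of Python. This is a sketch, not from the slides; note that computing the sigmoid exactly gives 0.574 where the slides round to 0.572, so the error comes out near 0.104 rather than 0.105.

```python
import math

def sigmoid(n):
    return 1.0 / (1.0 + math.exp(-n))

# iteration 1 values from the slides
in1, in2 = 1, 1
w1, w2 = 0.4, -0.1
desired = 1

p = in1 * w1 + in2 * w2                  # net input: 0.4 - 0.1 = 0.3
op = sigmoid(p)                          # about 0.574 (the slides report 0.572)
error = op * (1 - op) * (desired - op)   # about 0.104 (the slides report 0.105)
print(p, op, error)
```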
Iteration 1, 2nd Step: Back Propagation at the output layer
1- Finding ΔW1 and updating W1 (learning rate η = 0.45, momentum α = 0.9):
ΔW1 = η * Error OP * in1
ΔW1 = 0.45 * 0.105 * 1
ΔW1 = 0.047
New W1 = old W1 + ΔW1 + (α * ΔW1(t-1))
New W1 = 0.4 + 0.047 + (0.9 * 0)
New W1 = 0.447
Iteration 1, 2nd Step (continued): Finding ΔW2 and updating W2:
ΔW2 = η * Error OP * in2
ΔW2 = 0.45 * 0.105 * 1
ΔW2 = 0.047
New W2 = old W2 + ΔW2 + (α * ΔW2(t-1))
New W2 = -0.1 + 0.047 + (0.9 * 0)
New W2 = -0.053
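The weight-update step can be sketched as a small Python helper. This is an illustration, not the slides' code; it uses the learning rate 0.45 and momentum coefficient 0.9 that appear in the slides' arithmetic, and the function name `update_weight` is a hypothetical label.

```python
ETA = 0.45    # learning rate, taken from the slides' arithmetic
ALPHA = 0.9   # momentum coefficient, taken from the slides' arithmetic

def update_weight(w, error, inp, prev_delta):
    # delta rule with momentum: new w = w + eta*error*input + alpha*previous delta
    delta = ETA * error * inp
    return w + delta + ALPHA * prev_delta, delta

# iteration 1: error 0.105, both inputs 1, no previous delta yet
new_w1, d1 = update_weight(0.4, 0.105, 1, 0.0)
new_w2, d2 = update_weight(-0.1, 0.105, 1, 0.0)
print(new_w1, new_w2)   # 0.44725 and -0.05275; the slides round to 0.447 and -0.053
```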
Iteration 2, 1st Step: Forward Computation
in1 = 1 and in2 = 1; updated weights W1 = 0.447, W2 = -0.053
P = in1*W1 + in2*W2
P = 1*0.447 + 1*(-0.053)
P = 0.447 - 0.053 = 0.394
OP = 1 / (1 + e^(-0.394))
OP = 0.597
Iteration 2, 2nd Step: Back Propagation at the output layer
1- Computing Errors:
Error OP = OP * (1 - OP) * (Desired OP - OP)
Error OP = 0.597 * (1 - 0.597) * (1 - 0.597) = 0.0969
Because the error is greater than 0.001, we need to update the weights.
Iteration 2, 2nd Step (continued): Finding ΔW1 and updating W1:
ΔW1 = η * Error OP * in1
ΔW1 = 0.45 * 0.0969 * 1
ΔW1 = 0.0426
New W1 = old W1 + ΔW1 + (α * ΔW1(t-1))
New W1 = 0.447 + 0.0426 + (0.9 * 0.047)
New W1 = 0.5319
Iteration 2, 2nd Step (continued): Finding ΔW2 and updating W2:
ΔW2 = η * Error OP * in2
ΔW2 = 0.45 * 0.0969 * 1
ΔW2 = 0.0426
New W2 = old W2 + ΔW2 + (α * ΔW2(t-1))
New W2 = -0.053 + 0.0426 + (0.9 * 0.047)
New W2 = 0.0319
Iteration 3, 1st Step: Forward Computation
in1 = 1 and in2 = 1; updated weights W1 = 0.5319, W2 = 0.0319
P = in1*W1 + in2*W2
P = 1*0.5319 + 1*0.0319
P = 0.5319 + 0.0319 = 0.5638
OP = 1 / (1 + e^(-0.5638))
OP = 0.6384
Iteration 3, 2nd Step: Back Propagation at the output layer
1- Computing Errors:
Error OP = OP * (1 - OP) * (Desired OP - OP)
Error OP = 0.6384 * (1 - 0.6384) * (1 - 0.6384) = 0.083
Because the error is greater than 0.001, we need to update the weights.
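The slides repeat the same forward-compute / error / weight-update cycle until the error falls below 0.001. As a sketch (not from the slides), the whole loop could be written as one function; the name `train` and the iteration cap are assumptions added here.

```python
import math

def sigmoid(n):
    return 1.0 / (1.0 + math.exp(-n))

def train(in1=1, in2=1, desired=1, w1=0.4, w2=-0.1,
          eta=0.45, alpha=0.9, goal=0.001, max_iters=1000):
    # Repeat forward pass, error computation, and momentum weight update
    # until the error drops below the goal (or the safety cap is hit).
    prev_d1 = prev_d2 = 0.0
    errors = []
    for _ in range(max_iters):
        op = sigmoid(in1 * w1 + in2 * w2)
        error = op * (1 - op) * (desired - op)
        errors.append(error)
        if abs(error) < goal:
            break
        d1 = eta * error * in1
        d2 = eta * error * in2
        w1 = w1 + d1 + alpha * prev_d1
        w2 = w2 + d2 + alpha * prev_d2
        prev_d1, prev_d2 = d1, d2
    return errors, (w1, w2)

errors, weights = train()
print(len(errors), errors[:3])
```

The first few errors come out near the slides' 0.105, 0.0969, and 0.083 (exact sigmoid arithmetic differs slightly from the slides' rounding), and the error shrinks each iteration, which is the curve the next slide plots.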
Graphing the Error with Iteration
[Plot: error (y-axis, 0 to 0.16) against iteration number (x-axis, 0 to 6); the error decreases with each iteration.]
Class Ended