
Neural Networks and Learning Machines
Backpropagation for updating weights
Instructor: Ahmed Yousry
Neural Network Training Steps
1 Weight Initialization

2 Inputs Application

3 Sum of Inputs-Weights Products (SOP)

4 Activation Functions

5 Weights Adaptation

6 Back to step 2

A minimal sketch of this training loop is shown below.
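The following is a hedged sketch of the six steps above for a single sigmoid neuron. The variable names (inputs, desired, learning_rate, weights), the sample values, and the use of NumPy are illustrative assumptions, not part of the original slides.

```python
import numpy as np

def sigmoid(s):
    # Step 4: activation function f(s) = 1 / (1 + e^(-s))
    return 1.0 / (1.0 + np.exp(-s))

# Hypothetical data: one training sample with two features.
inputs = np.array([0.1, 0.3])
desired = 0.7
learning_rate = 0.01          # alpha, with 0 <= alpha <= 1

# Step 1: weight initialization (small random values) plus a bias.
rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=2)
bias = 0.0

for epoch in range(100):
    # Step 2: apply the inputs.
    x = inputs
    # Step 3: sum of the inputs-weights products (SOP).
    s = np.dot(x, weights) + bias
    # Step 4: activation function gives the prediction.
    y = sigmoid(s)
    # Step 5: weights adaptation using the gradient of E = 0.5*(d - y)^2,
    # derived later in these slides via backpropagation.
    grad_w = (y - desired) * y * (1.0 - y) * x
    grad_b = (y - desired) * y * (1.0 - y)
    weights -= learning_rate * grad_w
    bias -= learning_rate * grad_b
    # Step 6: back to step 2 (next iteration of the loop).
```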
Regarding 5th step: Weights Adaptation
First method:

Learning rate α, with 0 ≤ α ≤ 1
Regarding 5th step: Weights Adaptation
Second method: Backpropagation
Forward vs. backward passes

Feedforward pass: Inputs → Weights → SOP → Prediction Output → Prediction Error

Backward pass: Prediction Error → Prediction Output → SOP → Weights → Inputs

The backpropagation algorithm is a sensible approach for dividing the contribution of each weight to the prediction error.
Backward pass

Let us work with a simpler example

y = x²·z + c

How do we answer this question: what is the effect on the output y of a change in the variable x?

This question is answered using derivatives. The derivative of y with respect to x (∂y/∂x) tells us the effect of changing the variable x on the output y.
Backward pass
Calculating the Derivatives

y = x²·z + c

The derivative ∂y/∂x can be calculated based on two derivative rules:

Square rule: ∂(x²)/∂x = 2x
Constant rule: ∂(c)/∂x = 0

The result is:

∂y/∂x = ∂(x²·z + c)/∂x = 2zx + 0 = 2zx
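As a quick sanity check, the analytic result 2zx can be compared against a finite-difference approximation. This sketch and its sample values (x = 1.5, z = 2.0, c = 3.0) are illustrative assumptions.

```python
def y(x, z, c):
    # y = x^2 * z + c
    return x**2 * z + c

def dy_dx_analytic(x, z):
    # Analytic derivative from the slide: dy/dx = 2*z*x
    return 2.0 * z * x

def dy_dx_numeric(x, z, c, h=1e-6):
    # Central finite-difference approximation of dy/dx.
    return (y(x + h, z, c) - y(x - h, z, c)) / (2.0 * h)

x, z, c = 1.5, 2.0, 3.0
print(dy_dx_analytic(x, z))      # 6.0
print(dy_dx_numeric(x, z, c))    # ~6.0
```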
Backward pass
Calculating the derivative of the prediction error w.r.t. the weights

E = ½(desired − predicted)²

desired = constant
predicted = f(s) = 1 / (1 + e^(−s))
s = Σᵢ xᵢwᵢ + b   (sum of products, SOP)

Substituting the prediction into the error:

E = ½(desired − 1 / (1 + e^(−(Σᵢ xᵢwᵢ + b))))²
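A minimal sketch of these three quantities in code, assuming NumPy; the function names (sop, sigmoid, squared_error) and the sample values are illustrative assumptions.

```python
import numpy as np

def sop(x, w, b):
    # s = sum_i x_i * w_i + b  (sum of products)
    return np.dot(x, w) + b

def sigmoid(s):
    # predicted = f(s) = 1 / (1 + e^(-s))
    return 1.0 / (1.0 + np.exp(-s))

def squared_error(desired, predicted):
    # E = 0.5 * (desired - predicted)^2
    return 0.5 * (desired - predicted) ** 2

# Hypothetical values, just to show the call chain.
x = np.array([0.1, 0.3])
w = np.array([0.5, 0.2])
b = 1.0
d = 0.7

s = sop(x, w, b)
y = sigmoid(s)
E = squared_error(d, y)
```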
Regarding 5th step: Weights Adaptation
Second method: Backpropagation
Backward pass

What is the change in the prediction error (E) given a change in a weight (W)?
Get the partial derivative of E with respect to W: ∂E/∂W

E = ½(d − y)²,  where d (desired output) is a constant
y (predicted output) = f(s) = 1 / (1 + e^(−s))
s = Σᵢ xᵢwᵢ + b   (sum of products, SOP), with weights w₁, w₂

Substituting:

E = ½(d − 1 / (1 + e^(−(Σᵢ xᵢwᵢ + b))))²
Regarding 5th step: Weights Adaptation
Second method: Backpropagation
Chain Rule
Weight derivative

E = ½(d − y)²,  y = f(s) = 1 / (1 + e^(−s)),  s = x₁w₁ + x₂w₂ + b,  weights w₁, w₂

By the chain rule:

∂E/∂w₁ = ∂E/∂y × ∂y/∂s × ∂s/∂w₁

∂E/∂w₂ = ∂E/∂y × ∂y/∂s × ∂s/∂w₂
Regarding 5th step: Weights Adaptation
Second method: Backpropagation
Weight derivative

∂E/∂y = ∂/∂y [½(d − y)²] = −(d − y) = y − d

∂y/∂s = ∂/∂s [1 / (1 + e^(−s))] = (1 / (1 + e^(−s))) · (1 − 1 / (1 + e^(−s)))

∂s/∂w₁ = ∂/∂w₁ [x₁w₁ + x₂w₂ + b] = x₁,   ∂s/∂w₂ = ∂/∂w₂ [x₁w₁ + x₂w₂ + b] = x₂

Putting the three factors together:

∂E/∂wᵢ = (y − d) · (1 / (1 + e^(−s))) · (1 − 1 / (1 + e^(−s))) · xᵢ
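A minimal sketch of this chain-rule gradient in code; the function name error_gradient and the sample values are assumptions for illustration, not part of the slides.

```python
import numpy as np

def error_gradient(x, w, b, d):
    # Forward pass: SOP, sigmoid prediction, then the three chain-rule factors.
    s = np.dot(x, w) + b
    y = 1.0 / (1.0 + np.exp(-s))       # predicted output

    dE_dy = y - d                      # dE/dy for E = 0.5*(d - y)^2
    dy_ds = y * (1.0 - y)              # derivative of the sigmoid
    ds_dw = x                          # ds/dw_i = x_i

    # dE/dw_i = (y - d) * y * (1 - y) * x_i
    return dE_dy * dy_ds * ds_dw

x = np.array([0.1, 0.3])
w = np.array([0.5, 0.2])
b = 1.0
d = 0.7
print(error_gradient(x, w, b, d))
```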
Regarding 5th step: Weights Adaptation
Second method: Backpropagation
Update the Weights

To update the weights, use gradient descent:

W_new = W_old − α · ∂E/∂W

[Figure: two plots of f(w) versus w, one with a negative slope and one with a positive slope.]

Negative slope: W_new = W_old − (−ve), so the weight increases.
Positive slope: W_new = W_old − (+ve), so the weight decreases.
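A minimal, self-contained sketch of this update rule; the function name update_weight is hypothetical, and the default learning rate simply reuses the 0.01 value from the example slide below.

```python
def update_weight(w_old, gradient, alpha=0.01):
    # Gradient descent: step opposite to the slope of the error surface.
    # A negative gradient makes the weight increase; a positive one decreases it.
    return w_old - alpha * gradient

print(update_weight(0.5, -0.2))   # 0.502 (weight increases)
print(update_weight(0.5, +0.2))   # 0.498 (weight decreases)
```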
Example

Learning rate: 0.01
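The example slide appears truncated here. As a hedged illustration only, the sketch below runs one full training step with the stated learning rate of 0.01; the inputs, initial weights, bias, and desired output are hypothetical values that do not come from the original slides.

```python
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

# Hypothetical single training sample and initial parameters.
x = np.array([0.1, 0.3])   # inputs
w = np.array([0.5, 0.2])   # initial weights
b = 1.0                    # bias
d = 0.7                    # desired output
alpha = 0.01               # learning rate from the slide

# Forward pass.
s = np.dot(x, w) + b
y = sigmoid(s)
error_before = 0.5 * (d - y) ** 2

# Backward pass: dE/dw_i = (y - d) * y * (1 - y) * x_i.
grad_w = (y - d) * y * (1.0 - y) * x
grad_b = (y - d) * y * (1.0 - y)

# Gradient descent update.
w = w - alpha * grad_w
b = b - alpha * grad_b

# Forward pass again to confirm the error decreased.
y_new = sigmoid(np.dot(x, w) + b)
error_after = 0.5 * (d - y_new) ** 2
print(error_before, error_after)   # error_after should be marginally smaller
```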
