Backpropagation - Numerical

We are trying to find the weight value at which the error is minimum. Essentially, we need to figure out whether to increase or decrease the weight. Once we know the direction, we keep updating the weight that way until the error stops decreasing. If a further update would make the error increase again, we stop: that is the final weight value.
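The update rule described above can be sketched as a tiny gradient-descent loop. The error curve, starting weight, and learning rate below are made-up illustrations, not values from this example:

```python
def error(w):
    # Hypothetical error curve with its minimum at w = 3.
    return (w - 3.0) ** 2

def gradient(w):
    # Derivative of the error curve above: d/dw (w - 3)^2 = 2(w - 3).
    return 2.0 * (w - 3.0)

w = 0.0    # initial weight guess
lr = 0.1   # learning rate (step size)
for _ in range(200):
    # gradient > 0 -> decrease w; gradient < 0 -> increase w
    w -= lr * gradient(w)

print(round(w, 3))
```

The sign of the gradient tells us which direction reduces the error, which is exactly the "increase or decrease" decision in the text.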
Forward Propagation
Backward Propagation
Starting from the sigmoid output:

out_o1 = 1 / (1 + e^(-net_o1))

1 + e^(-net_o1) = 1 / out_o1

e^(-net_o1) = (1 / out_o1) - 1 = (1 - out_o1) / out_o1

Differentiating the sigmoid gives

d(out_o1)/d(net_o1) = e^(-net_o1) / (1 + e^(-net_o1))^2 = e^(-net_o1) × out_o1^2

Substituting e^(-net_o1) = (1 - out_o1) / out_o1:

d(out_o1)/d(net_o1) = ((1 - out_o1) / out_o1) × out_o1^2 = (1 - out_o1) × out_o1
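The identity just derived, d(out)/d(net) = out × (1 - out), can be checked numerically against a central-difference derivative; the test point net = 0.7 is an arbitrary choice:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

net = 0.7
out = sigmoid(net)
analytic = out * (1.0 - out)          # the closed form derived above

# Central-difference numerical derivative of sigmoid at the same point.
h = 1e-6
numeric = (sigmoid(net + h) - sigmoid(net - h)) / (2.0 * h)

print(analytic, numeric)
```

The two values agree to many decimal places, confirming the algebra.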
Calculation at hidden layers
Putting the above pieces together gives the following expressions.

With respect to the output layer:

∂E_total/∂w_ho = -(target_o - out_o) × out_o(1 - out_o) × out_h = δ_o × out_h

With respect to the hidden layer:

∂E_total/∂w_ih = (Σ_o δ_o × w_ho) × out_h(1 - out_h) × i = δ_h × i

Here, δ refers to ∂E_total/∂net, the error term at a node.
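As a rough sketch of the output-layer and hidden-layer gradient expressions, for one hidden unit feeding two outputs. All numbers below are made-up illustrations, not values from the text:

```python
# Illustrative activations, weights, and targets (assumed, not from the text).
out_h = 0.6                # hidden activation
w_ho = [0.4, 0.5]          # hidden -> output weights
out_o = [0.75, 0.77]       # output activations
target = [0.01, 0.99]

# Output layer: delta_o = -(target - out) * out * (1 - out)
delta_o = [-(t - o) * o * (1 - o) for t, o in zip(target, out_o)]
grad_w_ho = [d * out_h for d in delta_o]    # dE/dw for hidden->output weights

# Hidden layer: delta_h = (sum over outputs of delta_o * w) * out_h * (1 - out_h)
delta_h = sum(d * w for d, w in zip(delta_o, w_ho)) * out_h * (1 - out_h)
i = 0.05                                    # an input feeding the hidden unit
grad_w_ih = delta_h * i                     # dE/dw for input->hidden weight

print(grad_w_ho, grad_w_ih)
```

Note how the hidden-layer delta sums the output deltas weighted by the connecting weights: that is the "back-propagation" of error.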
Similarly, we can calculate the other weight values.
After that, we propagate forward again, compute the output, and calculate the error.
If the error is small enough we stop; otherwise we propagate backwards again and update the weight values.
This process repeats until the error reaches a minimum.
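The full loop just described (forward pass, error, backward pass, update, repeat) can be sketched end to end for a small 2-2-2 network. The layer sizes and starting values below follow the linked step-by-step example; treat them as illustrative rather than definitive:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

inputs = [0.05, 0.10]
targets = [0.01, 0.99]
w_ih = [[0.15, 0.20], [0.25, 0.30]]   # w_ih[h][i]: input i -> hidden h
w_ho = [[0.40, 0.45], [0.50, 0.55]]   # w_ho[o][h]: hidden h -> output o
b_h, b_o = 0.35, 0.60                 # shared biases per layer
lr = 0.5                              # learning rate

def forward():
    out_h = [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b_h) for row in w_ih]
    out_o = [sigmoid(sum(w * h for w, h in zip(row, out_h)) + b_o) for row in w_ho]
    return out_h, out_o

def total_error(out_o):
    # Sum of squared errors, E = sum 1/2 (target - out)^2
    return sum(0.5 * (t - o) ** 2 for t, o in zip(targets, out_o))

errors = []
for _ in range(1000):
    out_h, out_o = forward()
    errors.append(total_error(out_o))
    # Output-layer deltas: -(target - out) * out * (1 - out)
    delta_o = [-(t - o) * o * (1 - o) for t, o in zip(targets, out_o)]
    # Hidden-layer deltas: (sum of delta_o * w) * out_h * (1 - out_h)
    delta_h = [sum(delta_o[o] * w_ho[o][h] for o in range(2)) * out_h[h] * (1 - out_h[h])
               for h in range(2)]
    # Update every weight in the negative-gradient direction.
    for o in range(2):
        for h in range(2):
            w_ho[o][h] -= lr * delta_o[o] * out_h[h]
    for h in range(2):
        for i in range(2):
            w_ih[h][i] -= lr * delta_h[h] * inputs[i]

print(errors[0], errors[-1])
```

Running this shows the error shrinking across iterations, which is the stopping criterion described above.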

Reference: https://mattmazur.com/2015/03/17/a-step-by-step-backpropagation-example/
