Training by Error Backpropagation
ARTIFICIAL INTELLIGENCE
What is backpropagation?
Backpropagation is a supervised learning algorithm for training
neural networks.
It is a method of fine-tuning the weights of a neural network based on the error rate
obtained in the previous iteration. Proper tuning of the weights reduces error
rates and makes the model more reliable by improving its generalization.
Suppose we feed a picture of a cat to the model. Since the model is not trained yet,
we don't know which weights to use, so we initialize the
weights to random values.
Now, we get the output ‘dog’ instead of the expected output ‘cat’. The model calculates the
difference between the result for ‘cat’ and the result that gave us ‘dog’. This is the error. It then
propagates this error backwards through the model, updating all the weights in proportion to the error.
We repeat the process until the error is minimized and the model is trained.
Once the error reaches its minimum, your model is ready to work with new, unseen data!
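The cat/dog example can be made concrete. A minimal sketch of computing the error, assuming two output scores and a squared-error loss (both are illustrative assumptions, not part of the original text):

```python
# Assumed output scores from the untrained network
predicted = {"cat": 0.3, "dog": 0.7}   # the network leans toward 'dog'
target    = {"cat": 1.0, "dog": 0.0}   # the picture is actually a cat

# The error is the gap between what we got and what we wanted
error = sum((predicted[c] - target[c]) ** 2 for c in predicted)
print(error)  # 0.98 -- this value is what gets propagated backwards
```

It is this single error value (or, more precisely, its gradient with respect to each weight) that drives the weight updates described above.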
Training a model with backpropagation
Diagram
• Calculate the error – how far the model's output is from the actual output.
• Check for minimum error – check whether the error has been minimized.
• Update the parameters – if the error is large, update the parameters (weights
and biases). Then check the error again. Repeat the process until the error
reaches its minimum.
• Model is ready to make a prediction – once the error is at its minimum, you can
feed new inputs to your model and it will produce the output.
Global loss minimum
We are trying to find the weight value at which the error becomes minimum.
Basically, we need to figure out whether to increase or decrease the
weight. Once we know that, we keep updating the weight in that
direction until the error stops decreasing. You might reach a point where
updating the weight further would increase the error. At that point you stop,
and that is your final weight value.
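The direction to move comes from the sign of the error's slope: a positive slope means decrease the weight, a negative slope means increase it. A minimal numeric sketch (the quadratic loss and step size here are assumptions chosen for illustration):

```python
def loss(w):
    # Assumed toy loss with its minimum at w = 3.0
    return (w - 3.0) ** 2

def numeric_grad(w, eps=1e-5):
    # Finite-difference slope of the loss at w:
    # positive slope -> decrease w, negative slope -> increase w
    return (loss(w + eps) - loss(w - eps)) / (2 * eps)

w = 0.0
for _ in range(200):
    w -= 0.1 * numeric_grad(w)  # step opposite to the slope

# w has settled near 3.0; stepping further in either direction
# would increase the loss again, so this is the final weight value
```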
Consider the following graph: