Training by Error Backpropagation

Backpropagation is an algorithm used to train neural networks by calculating the error in the network's predictions and using that error to update weights in the network. It works by first making a prediction with random weights, calculating the error between that prediction and the correct output, then propagating this error backwards to adjust the weights, reducing error. This process of predicting, calculating error, and updating weights is repeated on all training examples until the network converges on weights that minimize error.


Training by Error Backpropagation
ARTIFICIAL INTELLIGENCE
What is backpropagation?
 Backpropagation is a supervised learning algorithm for training neural networks.
 It is a method of fine-tuning the weights of a neural network based on the error rate obtained in the previous iteration. Proper tuning of the weights reduces the error rate and makes the model more reliable by improving its generalization.
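As a minimal sketch of what "fine-tuning a weight based on the error" means, here is one backpropagation update for a single sigmoid neuron; the input, target, initial weight, and learning rate are made-up illustrative values:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def update_weight(w, x, target, lr=0.5):
    """One backpropagation step for a single sigmoid neuron."""
    y = sigmoid(w * x)                 # forward pass: the prediction
    error = y - target                 # how far the prediction is from the target
    grad = error * y * (1.0 - y) * x   # chain rule: d(squared error / 2)/dw
    return w - lr * grad               # nudge the weight to reduce the error

w = 0.8                                # hypothetical random initial weight
for _ in range(500):
    w = update_weight(w, x=1.0, target=1.0)
print(sigmoid(w))                      # the output has moved close to the target 1.0
```

Repeating this small update is exactly the "error rate from the previous iteration" loop described above.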

 A Simple Neural Network Structure


How do we use error backpropagation?

 While designing a neural network, we begin by initializing the weights with random values.
 Obviously, we cannot guess the right weights in advance, so there is no guarantee that the values we selected fit our model best.
 With these initial weights, the model's output may be very different from the actual output, i.e. the error value is huge.
 Now, how do we reduce the error?
 Basically, we need some way to tell the model to change its parameters (the weights) so that the error becomes minimal.
According to this figure:

 We show a picture of a cat to the model. Since the model is not trained yet, we don't know which weights to use, so we initialize the weights to some random values.
 Suppose we then get the output 'dog' instead of the expected output 'cat'. The model calculates the difference between the expected result for 'cat' and the result that produced 'dog'. This is the error. It then propagates this error backwards through the model, updating all the weights in proportion to the error.
 We repeat this process until the error is minimized and the model is trained. Once the error is minimal, your model is ready to work with new, unseen data!
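The cat/dog story above can be sketched with a toy linear model: one weight vector per label, and each weight updated in proportion to the error. The feature values, labels, and learning rate here are invented purely for illustration:

```python
import random

random.seed(0)
labels = ["cat", "dog"]
features = [0.9, 0.2, 0.7]            # hypothetical features of a cat image

# random initial weights: one weight per (label, feature) pair
weights = {lab: [random.uniform(-1, 1) for _ in features] for lab in labels}

def scores(x):
    """Forward pass: a linear score for each label."""
    return {lab: sum(w_i * x_i for w_i, x_i in zip(ws, x))
            for lab, ws in weights.items()}

def train_step(x, target, lr=0.1):
    s = scores(x)
    for lab, ws in weights.items():
        desired = 1.0 if lab == target else 0.0
        error = s[lab] - desired       # how far this output is from its target
        for i, x_i in enumerate(x):
            ws[i] -= lr * error * x_i  # update each weight in proportion to the error

for _ in range(50):
    train_step(features, "cat")

s = scores(features)
print(max(s, key=s.get))              # prints "cat"
```

At first the random weights may well score 'dog' higher; after repeated error-driven updates the 'cat' score wins.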
 Training a model with backpropagation (diagram):

• Calculate the error – how far is your model's output from the actual output?
• Minimum error – check whether the error is minimized.
• Update the parameters – if the error is large, update the parameters (weights and biases), then check the error again. Repeat until the error is minimal.
• Model is ready to make a prediction – once the error is minimal, you can feed inputs to your model and it will produce the output.
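The four steps above can be put together as a minimal training loop. This is only a sketch: the one-weight linear "model", the numbers, and the tolerance are all illustrative choices, not part of the original slides:

```python
def forward(w, x):
    return w * x  # minimal "model": one weight, one input

def train(x, target, w=0.0, lr=0.1, tolerance=1e-6, max_iters=10_000):
    """Predict, measure the error, update, and repeat until the error is tiny."""
    for _ in range(max_iters):
        output = forward(w, x)
        error = output - target        # 1. calculate the error
        if abs(error) < tolerance:     # 2. check whether the error is minimized
            break
        w -= lr * error * x            # 3. update the parameter
    return w                           # 4. the model is ready to predict

w = train(x=2.0, target=6.0)
print(round(forward(w, 2.0), 3))       # prints 6.0
```

Each pass through the loop is one iteration of the predict / check / update cycle described in the bullet points.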
Global loss minimum
 We are trying to find the weight value at which the error becomes minimal. Basically, we need to figure out whether to increase or decrease the weight. Once we know that, we keep updating the weight in that direction until the error stops decreasing. You may reach a point where updating the weight further would increase the error; at that point you stop, and that is your final weight value.

 Consider a graph of loss against weight: we need to reach the global loss minimum.
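That descent along the loss curve can be sketched with a toy one-dimensional loss; the curve, starting point, and learning rate below are made up for illustration:

```python
def loss(w):
    return (w - 3.0) ** 2 + 1.0   # toy loss curve with its global minimum at w = 3

def grad(w):
    return 2.0 * (w - 3.0)        # the slope tells us which direction lowers the loss

w, lr = -5.0, 0.1
for _ in range(200):
    w -= lr * grad(w)             # increase or decrease w, whichever reduces the loss

print(round(w, 3), round(loss(w), 3))   # prints 3.0 1.0
```

The weight moves downhill along the curve and settles at the global loss minimum, where further updates would no longer reduce the error.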

 This is how we train a network by error backpropagation.
