5 Optimizers
Optimization Algorithms
• An epoch refers to a single pass of the entire dataset through the model during the
training phase. In other words, during one epoch, the algorithm processes the entire
dataset exactly once.
• An iteration refers to the processing of a single batch, i.e., one parameter-update
step. The number of iterations required to complete an epoch depends on the batch
size chosen.
• E.g., if you have a dataset of 1000 rows and you choose a batch size of 100, then
each iteration processes 100 rows, and it takes 10 iterations to complete an
epoch.
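The relationship between epochs, iterations, and batch size above can be sketched in a few lines of Python. The numbers (1000 rows, batch size 100) follow the example; the training step itself is only a placeholder comment.

```python
import math

dataset_size = 1000  # rows in the dataset (from the example above)
batch_size = 100     # rows processed per iteration

# Iterations needed to complete one epoch: ceil handles a final partial batch.
iterations_per_epoch = math.ceil(dataset_size / batch_size)
print(iterations_per_epoch)  # 10

# One epoch = one full pass over the dataset, one batch at a time.
for start in range(0, dataset_size, batch_size):
    batch = range(start, min(start + batch_size, dataset_size))
    # ...one iteration: forward pass, compute loss, backward pass, update weights...
```

With 1000 rows and a batch size of 100 there is no partial batch, but the `ceil` keeps the count correct when the batch size does not divide the dataset evenly (e.g., 1000 rows with batch size 300 would give 4 iterations).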
Gradient Descent Algorithm