MAT6007 - Session8 - Gradient Descent
Gradient Descent
The entire Gradient Descent algorithm can be summed up in a single update rule: θ_new = θ_old − α · ∇J(θ), where α is the learning rate and ∇J(θ) is the gradient of the cost function
We devise an algorithm to locate the minimum of a function, and that algorithm is called Gradient Descent
Once you decide which way to go, you might take a bigger step or a smaller step to reach your destination
The idea is that, by computing the derivative (slope) of the function, we can find the minimum of that function
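The idea above can be sketched as a single downhill step: evaluate the slope at the current point, then move in the opposite direction. A minimal sketch, assuming the hypothetical example function f(x) = x², whose derivative is f'(x) = 2x (the function and learning rate are illustrative choices, not from the slides):

```python
# One gradient-descent step on the example function f(x) = x**2.

def f_prime(x):
    """Derivative (slope) of f(x) = x**2."""
    return 2 * x

learning_rate = 0.1
x = 3.0                                   # arbitrary starting point
x_new = x - learning_rate * f_prime(x)    # step downhill along the slope
print(x_new)                              # 2.4 -- closer to the minimum at x = 0
```

Because the slope at x = 3 is positive, the step moves x towards smaller values, i.e. towards the minimum at x = 0.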
Understanding the Mathematics behind Gradient Descent
The Learning Rate
The size of the steps taken to reach the minimum (the bottom) is called the Learning Rate.
Derivatives
Chain Rule
Power Rule
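The two differentiation rules listed above can be checked numerically with a central finite difference, which approximates the true derivative. A small sketch (the example functions x³ and (x² + 1)² are illustrative, not from the slides):

```python
# Numerical check of the Power Rule and Chain Rule using a central
# finite difference as an approximation to the true derivative.

def numeric_derivative(f, x, h=1e-6):
    """Central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

x = 2.0

# Power Rule: d/dx x^3 = 3x^2
f = lambda x: x ** 3
assert abs(numeric_derivative(f, x) - 3 * x ** 2) < 1e-4

# Chain Rule: d/dx (x^2 + 1)^2 = 2*(x^2 + 1) * 2x
g = lambda x: (x ** 2 + 1) ** 2
assert abs(numeric_derivative(g, x) - 2 * (x ** 2 + 1) * (2 * x)) < 1e-4

print("power rule and chain rule checks pass")
```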
Understanding the Mathematics behind Gradient Descent
Calculating Gradient Descent
Apply these rules of calculus to our original equation and find the derivative of the Cost Function with respect to both m and b.
Calculate the gradient of the Error with respect to both m and b
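A sketch of those two gradients, assuming the cost is the usual mean squared error for a line y = mx + b, i.e. J(m, b) = (1/n) Σ (yᵢ − (m·xᵢ + b))² (an assumption, since the slide's equation is not reproduced here). Differentiating with the power and chain rules gives ∂J/∂m = (−2/n) Σ xᵢ(yᵢ − (m·xᵢ + b)) and ∂J/∂b = (−2/n) Σ (yᵢ − (m·xᵢ + b)):

```python
# Gradients of the assumed MSE cost J(m, b) with respect to m and b.

def gradients(xs, ys, m, b):
    """Return (dJ/dm, dJ/db) for the mean-squared-error cost."""
    n = len(xs)
    dm = (-2 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
    db = (-2 / n) * sum(y - (m * x + b) for x, y in zip(xs, ys))
    return dm, db

# Points lying exactly on y = 2x + 1, so both gradients at (m, b) = (2, 1)
# should be zero -- we are already at the minimum of the cost.
xs, ys = [0, 1, 2, 3], [1, 3, 5, 7]
dm, db = gradients(xs, ys, 2, 1)
print(dm, db)  # both ~0.0 at the minimum
```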
Example: find the local minimum of the function y = (x + 5)² starting from the point x = 3
https://gist.github.com/rohanjoseph93/ecbbb9fb1715d5c248bcad0a7d3bffd2#file-gradient_descent-ipynb
Reducing Loss: Gradient Descent | Machine Learning Crash Course (google.com)
https://towardsdatascience.com/implement-gradient-descent-in-python-9b93ed7108d1