Lecture 1.1 Gradient Descent Algorithm
$w \leftarrow w - \eta\,\nabla E(w)$
• 𝑤 : Model parameters (weights)
• 𝜂 : Learning rate
• 𝛻𝐸(𝑤): Gradient of the loss function 𝐸(𝑤) with respect to 𝑤
• It can also be written as:
$w \leftarrow w - \eta\,\frac{1}{N}\sum_{i=1}^{N}\bigl(f(x_i) - y_i\bigr)\,x_i$
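As a minimal sketch, assuming a single-weight linear model f(x) = w·x and a squared-error loss, the update above can be written in Python as follows. The toy data, learning rate, and number of steps are illustrative assumptions, not values from the lecture.

```python
import numpy as np

def gradient_descent_step(w, x, y, eta):
    """One update: w <- w - eta * (1/N) * sum((f(x_i) - y_i) * x_i),
    assuming f(x_i) = w * x_i (single-weight linear model)."""
    predictions = w * x                         # f(x_i) for all samples
    gradient = np.mean((predictions - y) * x)   # (1/N) * sum((f(x_i) - y_i) * x_i)
    return w - eta * gradient

# Illustrative toy data and learning rate (assumptions, not from the lecture).
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
eta = 0.1

w = 0.0
for step in range(1, 6):
    w = gradient_descent_step(w, x, y, eta)
    # E(w) = (1/(2N)) * sum((f(x_i) - y_i)^2); the 1/2 factor makes its
    # gradient match the update formula above.
    E = 0.5 * np.mean((w * x - y) ** 2)
    print(f"step {step}: w = {w:.4f}, E(w) = {E:.4f}")
```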
Iteration trace of gradient descent:

Sl. No.   𝑤     𝐸(𝑤)
1         0     11
2         0.6   7.76
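Reading the two rows against the update rule above: moving from 𝑤 = 0 to 𝑤 = 0.6 means the scaled gradient 𝜂𝛻𝐸(𝑤) at 𝑤 = 0 was −0.6, and the loss 𝐸(𝑤) dropped from 11 to 7.76:

$w_{2} = w_{1} - \eta\,\nabla E(w_{1}) \;\Rightarrow\; 0.6 = 0 - \eta\,\nabla E(0) \;\Rightarrow\; \eta\,\nabla E(0) = -0.6$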