Gradient Descent, Regression, Logistic Regression
Module 2: learning with Gradient Descent
Module 2: numerical optimization
DATA · PROBLEM · REPRESENTATION · LEARNING · PERFORMANCE
• J(x) = (x − 2)² + 1, and the initial guess for a minimum is x₀ = 3
• GD iteration 1
• GD iteration 2
• GD iteration 3 (traced numerically in the sketch below)
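A minimal sketch of these three iterations: gradient descent on J(x) = (x − 2)² + 1 from the initial guess x₀ = 3, with an assumed step size α = 0.3 (the slides do not specify one).

def J(x):
    return (x - 2) ** 2 + 1

def dJ(x):
    # derivative of J: dJ/dx = 2 (x - 2)
    return 2 * (x - 2)

x = 3.0        # initial guess x0 = 3
alpha = 0.3    # assumed learning rate

for i in range(1, 4):            # GD iterations 1, 2, 3
    x = x - alpha * dJ(x)        # gradient descent update
    print(f"iteration {i}: x = {x:.3f}, J(x) = {J(x):.3f}")

With this step size the iterates move from 3 to 2.4, 2.16, and 2.064, approaching the minimizer x = 2 where J(x) = 1; a different assumed α changes the exact values but not the behaviour.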
regression goal
objective: closed-form solution (sketched below)
drawbacks of the closed-form solution:
• very unstable
• impractical for large matrices
• slow
• undesirable in cases with many outliers
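For reference, a minimal sketch of the solution these drawbacks refer to, assuming it is the ordinary least-squares closed form w = (XᵀX)⁻¹Xᵀy; the explicit matrix inverse below is precisely the step that becomes slow and numerically unstable for large or ill-conditioned design matrices. The toy data is illustrative.

import numpy as np

# toy design matrix (bias column plus one feature) and targets
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# closed-form least-squares solution: w = (X^T X)^{-1} X^T y
w = np.linalg.inv(X.T @ X) @ (X.T @ y)
print(w)   # approximately [1., 2.], i.e. the line y = 1 + 2x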
Gradient Descent for linear regression
• objective = likelihood of the observations
- i.e., how likely the observed data is, given the regression model
- and take the log (see the sketch below)
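A minimal sketch of gradient descent on this objective, assuming the usual Gaussian-noise model, under which maximizing the log-likelihood is equivalent to minimizing the mean squared error; the data, step size, and iteration count are illustrative.

import numpy as np

# same toy data: bias column plus one feature, targets on the line y = 1 + 2x
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

w = np.zeros(2)    # initial weights
alpha = 0.05       # assumed learning rate

for _ in range(2000):                  # batch gradient descent
    residual = X @ w - y               # predictions minus targets
    grad = X.T @ residual / len(y)     # gradient of the mean squared error
    w = w - alpha * grad               # descent step

print(w)   # approaches [1., 2.] without inverting any matrix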
Logistic Regression
• consider the likelihood of the observations
- and take the log (the resulting log-likelihood is written out below)
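Written out, and assuming the standard formulation with labels yᵢ ∈ {0, 1}, parameters θ, and the sigmoid σ(z) = 1 / (1 + e⁻ᶻ), the log-likelihood being referred to is:

\ell(\theta) = \sum_{i=1}^{n} \left[ y_i \log \sigma(\theta^\top x_i) + (1 - y_i) \log\left(1 - \sigma(\theta^\top x_i)\right) \right]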
• maximize the log-likelihood using gradient ascent
- single-datapoint derivation (sketched below)
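A sketch of that single-datapoint derivation, under the same assumptions as above and using σ′(z) = σ(z)(1 − σ(z)):

\frac{\partial}{\partial \theta_j}\Big[\, y \log \sigma(\theta^\top x) + (1 - y) \log\big(1 - \sigma(\theta^\top x)\big) \Big]
  = \left( \frac{y}{\sigma(\theta^\top x)} - \frac{1 - y}{1 - \sigma(\theta^\top x)} \right) \sigma(\theta^\top x)\big(1 - \sigma(\theta^\top x)\big)\, x_j
  = \big( y - \sigma(\theta^\top x) \big)\, x_j

So each datapoint pushes θ in the direction of its features, weighted by the prediction error y − σ(θᵀx).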
• write down the update rules
- batch or stochastic (see the code sketch below)
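A minimal sketch of both update rules, using the single-datapoint gradient derived above; the toy data, learning rate, and iteration counts are illustrative.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# toy binary-classification data: bias column plus one feature
X = np.array([[1.0, 0.5],
              [1.0, 1.5],
              [1.0, 2.5],
              [1.0, 3.5]])
y = np.array([0.0, 0.0, 1.0, 1.0])
alpha = 0.1    # assumed learning rate

# batch gradient ascent: one update per pass, using the full gradient
theta_batch = np.zeros(2)
for _ in range(500):
    grad = X.T @ (y - sigmoid(X @ theta_batch))    # sum of (y_i - sigma(theta^T x_i)) x_i
    theta_batch = theta_batch + alpha * grad

# stochastic gradient ascent: one update per datapoint
theta_sgd = np.zeros(2)
for _ in range(500):
    for x_i, y_i in zip(X, y):
        theta_sgd = theta_sgd + alpha * (y_i - sigmoid(x_i @ theta_sgd)) * x_i

print(theta_batch, theta_sgd)

The batch rule sums the gradient over all datapoints before each step, while the stochastic rule applies the single-datapoint gradient immediately after seeing each example.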