Lecture 10: Descent 2
- Generic descent algorithm
- Generalization to multiple dimensions
- Problems of descent methods, possible improvements
- Fixes
- Local minima
[Figures: gradient descent in one dimension. Starting from an initial guess on f(x), each step follows the new gradient downhill to the next point; the iteration stops at the minimum m, with value f(m).]
Generic descent algorithm (one dimension):
- guess: x
- direction: downhill, i.e. direction = -f'(x)
- step: h > 0
- update: x := x - h f'(x)
- stop when f'(x) ~ 0
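As a concrete illustration (not from the original slides), here is a minimal Python sketch of this loop; the test function, step size, and tolerance are arbitrary choices.

# Minimal sketch of one-dimensional gradient descent.
# The step size h, tolerance, and test function are illustrative choices.
def descend_1d(df, x, h=0.1, tol=1e-8, max_iter=10_000):
    for _ in range(max_iter):
        g = df(x)
        if abs(g) < tol:       # stop when f'(x) ~ 0
            break
        x = x - h * g          # step downhill: x := x - h f'(x)
    return x

# Example: f(x) = (x - 2)^2 has f'(x) = 2 (x - 2) and its minimum at x = 2.
print(descend_1d(lambda x: 2 * (x - 2), x=0.0))  # ~2.0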
The gradient is just a generalization of the derivative to two dimensions, and it can be generalized to any dimension.
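Concretely, for f : R^N -> R, the gradient at a point x collects the N partial derivatives:

\[
\nabla f(x) = \left( \frac{\partial f}{\partial x_1}(x), \ \frac{\partial f}{\partial x_2}(x), \ \ldots, \ \frac{\partial f}{\partial x_N}(x) \right)
\]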
Generic descent algorithm (N dimensions):
- guess: x
- direction: downhill, i.e. direction = -∇f(x)
- step: h > 0
- update: x := x - h ∇f(x)
- stop when ∇f(x) ~ 0
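The same loop in N dimensions, as a minimal NumPy sketch (the quadratic test function, step size, and tolerance are again illustrative assumptions):

import numpy as np

# Minimal sketch of gradient descent in N dimensions.
def descend(grad_f, x, h=0.1, tol=1e-8, max_iter=10_000):
    x = np.asarray(x, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:   # stop when ||∇f(x)|| ~ 0
            break
        x = x - h * g                 # x := x - h ∇f(x)
    return x

# Example: f(x) = ||x - c||^2 has gradient 2 (x - c) and its minimum at x = c.
c = np.array([1.0, -2.0, 3.0])
print(descend(lambda x: 2 * (x - c), np.zeros(3)))  # ~[1, -2, 3]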
Multiple dimensions
Everything that you have seen with derivatives can be generalized with the gradient in N dimensions.

If you want to minimize the price to buy your portfolio, you need to compute the gradient of its price:

[Figure: a portfolio as a point in N dimensions, with one axis per stock: Stock 1, Stock 2, ..., Stock i, ..., Stock N.]
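To make this concrete with a toy example (the cost function below is hypothetical, not from the lecture): when no closed-form gradient is at hand, each partial derivative of the portfolio price can be approximated by finite differences, one stock at a time.

import numpy as np

# Hypothetical portfolio cost as a function of the quantity held in each stock:
# a linear price term plus a small illustrative quadratic term.
def portfolio_cost(q, prices, penalty=0.01):
    return prices @ q + penalty * (q @ q)

# Central finite differences: perturb one stock at a time to estimate df/dq_i.
def numerical_gradient(f, q, eps=1e-6):
    g = np.zeros_like(q)
    for i in range(len(q)):
        e = np.zeros_like(q)
        e[i] = eps
        g[i] = (f(q + e) - f(q - e)) / (2 * eps)
    return g

prices = np.array([10.0, 25.0, 4.0])   # one price per stock (Stock 1..N)
q = np.ones_like(prices)
print(numerical_gradient(lambda q: portfolio_cost(q, prices), q))
# ~ prices + 2 * penalty * q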
[Figure: descent iterations on f(x); successive guesses move from the current point toward the minimum m, where the iteration stops at f(m).]
[S. Boyd and L. Vandenberghe, Convex Optimization, lecture notes, Stanford University, 2004]
In multiple dimensions:
Or equivalently
Rarely used in practice. More about this in EE227A (convex optimization, Prof. L. El Ghaoui).
Fixes
Several methods exist to address these problems:
- Line search methods, in particular:
  - Backtracking line search (sketched below)
  - Exact line search
- Normalized steepest descent
- Newton steps
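For the first fix, here is a minimal sketch of backtracking line search under the Armijo condition (the parameters alpha and beta are conventional choices, not taken from these slides):

import numpy as np

# Backtracking line search (Armijo condition): start from step t = 1 and
# shrink it by beta until f decreases enough along the descent direction d.
def backtracking(f, grad_f, x, d, alpha=0.3, beta=0.8):
    t = 1.0
    slope = grad_f(x) @ d    # directional derivative; negative for a descent direction
    while f(x + t * d) > f(x) + alpha * t * slope:
        t *= beta
    return t

# Usage inside a descent loop: d = -grad_f(x), then x = x + backtracking(f, grad_f, x, d) * d.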