Class 2
-
Overview of Algorithms
-
Importance of convex functions and sets for global optima (for a convex function over a convex set, every local minimizer is a global minimizer)
-
line search and trust region methods
-
line search: steepest descent, Newton's method (scale invariance; numerical check after this overview)
-
quasi-Newton methods, trust region methods (comparisons)
-
scaling
-
rates of convergence
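A minimal NumPy sketch of the scale-invariance point (the quadratic A, b and the rescaling S are made-up illustrations, not examples from class): under a change of variables x = S y, the Newton step computed in y-coordinates maps back to exactly the x-space Newton step, while the steepest descent step does not.

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # SPD Hessian of f(x) = 0.5 x^T A x - b^T x
b = np.array([1.0, -1.0])
S = np.array([[10.0, 0.0], [0.5, 0.1]])  # invertible rescaling, x = S y

x = np.array([2.0, 2.0])
grad_x = A @ x - b                       # gradient of f at x
grad_y = S.T @ grad_x                    # chain rule for g(y) = f(S y)
hess_y = S.T @ A @ S

newton_x = -np.linalg.solve(A, grad_x)
newton_y = -np.linalg.solve(hess_y, grad_y)
print(np.allclose(S @ newton_y, newton_x))   # True: Newton step is scale invariant

sd_x, sd_y = -grad_x, -grad_y
print(np.allclose(S @ sd_y, sd_x))           # False: steepest descent is not
```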
-
Line Search Methods,
of type x_{k+1} = x_k + alpha_k p_k,
where p_k is the search direction and
alpha_k is the step length
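A minimal sketch of this generic iteration, assuming NumPy; `direction` and `step_length` are hypothetical callables standing in for the design choices above (e.g. `direction=lambda x, g: -g` recovers steepest descent):

```python
import numpy as np

def line_search_method(f, grad, x0, direction, step_length, tol=1e-8, max_iter=200):
    """Iterate x_{k+1} = x_k + alpha_k p_k until the gradient is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = direction(x, g)                 # search direction p_k (want g @ p < 0)
        alpha = step_length(f, grad, x, p)  # step length alpha_k
        x = x + alpha * p
    return x
```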
-
Step Length
-
Wolfe conditions (geometric interpretations; checker sketched below)
sufficient decrease: f(x_k + alpha_k p_k) <= f(x_k) + c1 alpha_k grad f(x_k)^T p_k - step is not too large
curvature condition: grad f(x_k + alpha_k p_k)^T p_k >= c2 grad f(x_k)^T p_k - step is not too small
(with constants 0 < c1 < c2 < 1)
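A small checker for these two inequalities, assuming NumPy; the default constants c1, c2 are common choices, not values fixed in class:

```python
import numpy as np

def satisfies_wolfe(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    """Test the (weak) Wolfe conditions for step alpha along direction p."""
    g = grad(x)
    slope = g @ p                    # directional derivative, negative for descent
    x_new = x + alpha * p
    sufficient_decrease = f(x_new) <= f(x) + c1 * alpha * slope
    curvature = grad(x_new) @ p >= c2 * slope
    return sufficient_decrease and curvature
```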
-
existence of step lengths satisfying the Wolfe conditions (whenever f is smooth and bounded below along the ray x_k + alpha p_k)
-
Convergence of Line Search Methods
(Zoutendijk's condition; cos theta_k = -grad f_k^T p_k / (||grad f_k|| ||p_k||), the cosine of the angle between p_k and the steepest descent direction - helper below)
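A one-liner making the cos theta quantity concrete (NumPy assumed); keeping cos theta_k bounded away from zero is what forces ||grad f_k|| -> 0 in the global convergence theorem:

```python
import numpy as np

def cos_theta(g, p):
    """Cosine of the angle between search direction p and -g (steepest descent)."""
    return -(g @ p) / (np.linalg.norm(g) * np.linalg.norm(p))
```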
-
Rate of Convergence
-
linear convergence rate for steepest descent -
dependence on condition number kappa of the Hessian: on a convex quadratic with exact line search, the error (in the Hessian norm) contracts by at worst a factor (kappa - 1)/(kappa + 1) per step, so convergence stalls as kappa grows (demo below)
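A NumPy demo of that rate (the quadratic and starting point are made-up; the start (kappa, 1) is the worst-case direction for steepest descent, generic starts do at least as well):

```python
import numpy as np

kappa = 20.0
Q = np.diag([1.0, kappa])        # Hessian of f(x) = 0.5 x^T Q x, condition number kappa
x = np.array([kappa, 1.0])       # worst-case starting direction

def q_norm(v):                   # error in the Q-norm (x* = 0)
    return np.sqrt(v @ Q @ v)

prev = q_norm(x)
for k in range(8):
    g = Q @ x                                # gradient
    alpha = (g @ g) / (g @ (Q @ g))          # exact line search step
    x = x - alpha * g
    cur = q_norm(x)
    print(f"step {k}: contraction {cur / prev:.4f}")
    prev = cur
print("bound (kappa-1)/(kappa+1) =", (kappa - 1) / (kappa + 1))
# every printed ratio equals 19/21 ~ 0.9048: linear convergence that
# degrades as the condition number of the Hessian grows
```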
-
Dennis-Moré condition - a quasi-Newton method's Hessian
approximations B_k need (asymptotic) agreement with the true Hessian H(x*) along the search directions,
||(B_k - H(x*)) p_k|| / ||p_k|| -> 0, for
superlinear convergence
-
with proof! - quadratic convergence of Newton's method near the
optimum: if the Hessian is Lipschitz continuous near a minimizer x* and positive definite at x*, then for x_k close enough to x*, ||x_{k+1} - x*|| <= C ||x_k - x*||^2 (numerical check below).
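A numerical check (not the proof), assuming NumPy; the separable test function is an illustration, chosen because its minimizer x* = 0 and derivatives are simple:

```python
import numpy as np

# Pure Newton (alpha = 1) on f(x) = sum(exp(x_i) - x_i), minimized at x* = 0.
def grad(x):
    return np.exp(x) - 1.0

def hess(x):
    return np.diag(np.exp(x))

x = np.array([1.0, -0.5])        # start near the minimizer
for k in range(6):
    print(f"iter {k}: ||x_k - x*|| = {np.linalg.norm(x):.3e}")
    x = x - np.linalg.solve(hess(x), grad(x))   # Newton step
# the error exponent roughly doubles each iteration
# (1e+0, 4e-1, 6e-2, 2e-3, 2e-6, 1e-12): quadratic convergence near x*
```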