Class 1
- Introduction (Chapter 1, pgs 1-4,6,8)
- Mathematical Formulation: example, level sets
- Definitions: local and global optima
- Definitions: convexity
- Fundamentals of Unconstrained Optimization (Chapter 2 - complete chapter except for R-Rates of Convergence)
- Example of nonlinear least squares data fitting
- Definitions: gradient, Hessian, Taylor's Theorem
Class 2, C&O 466/666
- Fundamentals of Unconstrained Optimization cont... (Chapter 2 - complete chapter except for R-Rates of Convergence) pgs 11-17,19-24,26-29
- Recognizing Solutions
- Definitions: gradient, Hessian, Taylor's Theorem, order notation (big and little O)
- directional derivative, curvature, linear model, direction of steepest descent
- first and second order necessary optimality conditions
- second order sufficient optimality conditions (see the sketch after this list)
- convexity and global minima, characterizations of convex functions
- Application: prove the arithmetic-geometric mean (AGM) inequality using unconstrained minimization
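A minimal Python sketch (my own illustrative example, not from the course notes) checking the first order necessary and second order sufficient conditions numerically; the function and the candidate point are placeholders.

    import numpy as np

    # Assumed example: f(x) = (x0 - 1)^2 + 2*(x1 + 3)^2, minimized at (1, -3).
    def f(x):
        return (x[0] - 1.0)**2 + 2.0*(x[1] + 3.0)**2

    def grad(x):
        return np.array([2.0*(x[0] - 1.0), 4.0*(x[1] + 3.0)])

    def hess(x):
        return np.array([[2.0, 0.0], [0.0, 4.0]])

    x_star = np.array([1.0, -3.0])

    # First order necessary condition: the gradient vanishes at a local minimizer.
    print("||grad f(x*)|| =", np.linalg.norm(grad(x_star)))

    # Second order sufficient condition: the Hessian is positive definite at x*.
    eigvals = np.linalg.eigvalsh(hess(x_star))
    print("Hessian eigenvalues:", eigvals, "-> positive definite:", bool(np.all(eigvals > 0)))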
Class 3, C&O 466/666
- Optimality Conditions Using Tangent Cones and Lagrange Multipliers (Section 12.3, pgs 331-342)
- First order Geometric Optimality Conditions: the gradient is in the polar of the tangent cone
- First order Analytic Optimality Conditions: Elementary Lagrange Multiplier Theorem for equality constraints
- Overview of Algorithms (Chap. 3, pgs 35-40,42-53, Chap. 4, pgs 65-69)
- Importance of convex functions and sets for global optima
- line search and trust region methods
- line search: steepest descent, Newton's method (scale invariance); see the sketch after this list
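A minimal Python sketch (my own example, not from the text) comparing the steepest descent and Newton directions on an ill-conditioned quadratic, and checking the scale invariance of the Newton step under a diagonal rescaling of the variables.

    import numpy as np

    # Assumed example: f(x) = 0.5 x^T A x with an ill-conditioned Hessian; minimizer x* = 0.
    A = np.array([[100.0, 0.0], [0.0, 1.0]])
    grad = lambda x: A @ x

    x = np.array([1.0, 1.0])
    sd_dir = -grad(x)                          # steepest descent direction
    newton_dir = -np.linalg.solve(A, grad(x))  # Newton direction: points straight at x* = 0
    print("steepest descent:", sd_dir)
    print("Newton          :", newton_dir)

    # Scale invariance of Newton's method: rescale the variables x = D y;
    # the Newton step computed in y maps back to exactly the same step in x.
    D = np.diag([10.0, 0.1])
    A_y = D @ A @ D                                  # Hessian in the y variables
    y = np.linalg.solve(D, x)
    newton_dir_y = -np.linalg.solve(A_y, A_y @ y)    # Newton step in y
    print("Newton step mapped back to x:", D @ newton_dir_y)   # equals newton_dir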
Class 4, C&O 466/666
- Overview of Algorithms continued... (Chap. 4, pgs 65-69)
- trust region methods (comparisons)
- scaling
- rates of convergence
Class 5, C&O 466/666
- Line Search Methods (Chapter 3), with iterations of the form x_{k+1} = x_k + alpha_k p_k, where p_k is the search direction and alpha_k is the step length
- Step Length
- Wolfe conditions (geometric interpretations): sufficient decrease - the step is not too large; curvature condition - the step is not too small (see the sketch after this list)
- Lemma 3.1 (existence of step lengths)
- Convergence of Line Search Methods
- Theorem 3.2 (cos(theta) geometric interpretation)
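A small Python sketch (my own example) checking the two Wolfe conditions for several trial step lengths; too-large steps fail the sufficient decrease test and too-small steps fail the curvature test.

    import numpy as np

    # The (weak) Wolfe conditions for a step alpha along a descent direction p at x:
    #   sufficient decrease:  f(x + alpha*p) <= f(x) + c1*alpha*grad(x)^T p
    #   curvature:            grad(x + alpha*p)^T p >= c2*grad(x)^T p,   with 0 < c1 < c2 < 1.
    def wolfe_conditions(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
        g0p = grad(x) @ p
        sufficient_decrease = f(x + alpha*p) <= f(x) + c1*alpha*g0p
        curvature = grad(x + alpha*p) @ p >= c2*g0p
        return bool(sufficient_decrease), bool(curvature)

    # Assumed example: f(x) = x^T x, steepest descent direction at x = (1, 1).
    f = lambda x: x @ x
    grad = lambda x: 2.0*x
    x = np.array([1.0, 1.0])
    p = -grad(x)

    for alpha in [1.0, 0.5, 0.1, 0.01]:
        print(alpha, wolfe_conditions(f, grad, x, p, alpha))
    # alpha = 1.0 fails sufficient decrease (too large); alpha = 0.01 fails curvature (too small).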
Class 6, C&O 466/666
- Line Search Methods continued... (Chapter 3)
- Example: minimize g(x) = sum_{i=1}^m log( exp(a_i^T x - b_i) + exp(-a_i^T x + b_i) ) using Newton's method with backtracking. See the matlab program and the sketch after this list.
- Rate of Convergence
- Theorems 3.3, 3.4 (linear convergence rate for steepest descent - dependence on the condition number of the Hessian)
- Theorem 3.5 (Dennis-Moré condition - the quasi-Newton Hessian approximation needs (asymptotic) convergence in search direction for superlinear convergence)
- Theorem 3.7 (with proof! - quadratic convergence of Newton's method near the optimum)
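A minimal Python re-creation of the example above (the course's matlab program is not reproduced here); the data a_i, b_i below are random placeholders of my own.

    import numpy as np

    rng = np.random.default_rng(0)
    m, n = 50, 5
    A = rng.standard_normal((m, n))   # rows are a_i^T  (placeholder data)
    b = rng.standard_normal(m)        # b_i             (placeholder data)

    def g(x):
        r = A @ x - b
        return np.sum(np.log(np.exp(r) + np.exp(-r)))

    def grad(x):
        r = A @ x - b
        return A.T @ np.tanh(r)

    def hess(x):
        r = A @ x - b
        w = 1.0 - np.tanh(r)**2        # sech^2(r_i)
        return A.T @ (w[:, None] * A)

    # Newton's method with backtracking (sufficient decrease) line search.
    x = np.zeros(n)
    for k in range(25):
        gx, Hx = grad(x), hess(x)
        if np.linalg.norm(gx) < 1e-10:
            break
        p = np.linalg.solve(Hx, -gx)           # Newton direction
        alpha, c1 = 1.0, 1e-4
        while g(x + alpha*p) > g(x) + c1*alpha*(gx @ p):
            alpha *= 0.5                       # backtrack until sufficient decrease holds
        x = x + alpha*p
        print(k, g(x), alpha)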
Reference (for your interest only):
Yixun Shi, "Globally convergent algorithms for unconstrained optimization",
Comput. Optim. Appl. 16 (2000), no. 3, 295-308.
Class 6b, C&O 466/666
- (not responsible for this for the exam) A Survey of the Trust Region Subproblem Within a Semidefinite Programming Framework, Seminar at the University of Waterloo; an introduction to trust region methods, concentrating on duality and the trust region subproblem.
Class 7, C&O 466/666
- Discussion of the derivative as a linear operator
- Line Search Methods continued... (Chapter 3)
- Theorem 3.7 (with proof! - quadratic convergence of Newton's method near the optimum)
- Trust Region Methods (Chapter 4)
- basic outline (see the sketch after this list)
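A sketch of one basic trust-region ingredient (my own illustrative code, not the text's algorithm): the Cauchy point, the minimizer of the quadratic model along the steepest descent direction subject to the trust-region bound; the data B, g below are placeholders.

    import numpy as np

    def cauchy_point(g, B, delta):
        """Minimize the model m(p) = g^T p + 0.5 p^T B p along -g, subject to ||p|| <= delta."""
        gnorm = np.linalg.norm(g)
        gBg = g @ B @ g
        if gBg <= 0:
            tau = 1.0                                   # model decreases along -g all the way: go to the boundary
        else:
            tau = min(gnorm**3 / (delta * gBg), 1.0)
        return -tau * (delta / gnorm) * g

    # Assumed example data (my own placeholders):
    B = np.array([[2.0, 0.0], [0.0, 10.0]])
    g = np.array([1.0, 1.0])
    for delta in [0.1, 1.0, 10.0]:
        p = cauchy_point(g, B, delta)
        print(delta, p, np.linalg.norm(p))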
Class 9, C&O 466/666
- Conjugate Direction Methods (Chap. 5, pgs 101-110)
- Solving Ax=b for A pos. def. Expanding subspace Theorem 5.2 with proof.
Class 10, C&O 466/666
- Conjugate Gradient Methods (Chap. 5, pgs 101-110, 120-122)
- Solving Ax=b for A pos. def. Practical form (see the sketch after this list). Rates of convergence (clustering eigenvalues). Preconditioning.
- Outline of FR and PR nonlinear conjugate gradient methods
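A minimal Python sketch of the practical conjugate gradient iteration for Ax=b with A positive definite; the test matrix below is a random placeholder of my own.

    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
        """Conjugate gradient for A x = b with A symmetric positive definite."""
        n = len(b)
        max_iter = max_iter or n
        x = np.zeros(n)
        r = b - A @ x          # residual (= negative gradient of 0.5 x^T A x - b^T x)
        p = r.copy()           # first search direction
        rs = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs / (p @ Ap)      # exact step length along p
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p  # new direction, A-conjugate to the previous ones
            rs = rs_new
        return x

    # Assumed test data: a random symmetric positive definite matrix (placeholder of my own).
    rng = np.random.default_rng(1)
    M = rng.standard_normal((20, 20))
    A = M @ M.T + 20 * np.eye(20)
    b = rng.standard_normal(20)
    x = conjugate_gradient(A, b)
    print("residual norm:", np.linalg.norm(A @ x - b))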
Class 11, C&O 466/666
- Gauss-Newton Method (Chap. 10, pgs 251-262)
- Gauss-Newton method for nonlinear least squares: properties of the GN direction, Theorem 10.1, convergence rate (see the sketch after this list)
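A small Python sketch (my own model and data) of Gauss-Newton iterations for a nonlinear least squares fit; the GN direction is obtained from the linear least squares subproblem min ||J p + r||.

    import numpy as np

    # Assumed model and data (my own placeholders): fit y ~ x1 * exp(x2 * t).
    t = np.linspace(0.0, 1.0, 20)
    y = 2.0 * np.exp(-1.5 * t) + 0.01 * np.random.default_rng(2).standard_normal(20)

    def residual(x):
        return x[0] * np.exp(x[1] * t) - y

    def jacobian(x):
        e = np.exp(x[1] * t)
        return np.column_stack([e, x[0] * t * e])

    x = np.array([1.0, 0.0])                      # starting guess
    for k in range(15):
        r, J = residual(x), jacobian(x)
        # Gauss-Newton direction: solve the linear least squares subproblem  min ||J p + r||
        p, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + p                                 # full step (no line search in this sketch)
        print(k, 0.5 * (r @ r))
    print("estimate:", x)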
Class 12, C&O 466/666
- Nonlinear Equations (Chap. 11, pgs 276-286, 292-298, 304-310)
- Newton's Method, derivation using linearization, Theorem 11.2 WITH PROOF, convergence rate, Inexact Newton method, merit functions (see the sketch after this list)
- Continuation (Homotopy) methods (general outline only)
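A minimal Python sketch (my own example system) of Newton's method for F(x) = 0, derived from the linearization F(x) + J(x) p = 0 and safeguarded by simple backtracking on the merit function 0.5*||F(x)||^2.

    import numpy as np

    # My own example system F(x) = 0:
    #   F1 = x1^2 + x2^2 - 4,    F2 = exp(x1) + x2 - 1
    def F(x):
        return np.array([x[0]**2 + x[1]**2 - 4.0, np.exp(x[0]) + x[1] - 1.0])

    def J(x):
        return np.array([[2.0*x[0], 2.0*x[1]], [np.exp(x[0]), 1.0]])

    merit = lambda x: 0.5 * F(x) @ F(x)    # merit function 0.5*||F(x)||^2

    x = np.array([1.0, 1.0])
    for k in range(30):
        Fx = F(x)
        if np.linalg.norm(Fx) < 1e-12:
            break
        p = np.linalg.solve(J(x), -Fx)     # Newton step from the linearization F(x) + J(x) p = 0
        alpha = 1.0
        while merit(x + alpha*p) > (1.0 - 1e-4*alpha) * merit(x):   # simple sufficient decrease test
            alpha *= 0.5
        x = x + alpha*p
        print(k, np.linalg.norm(F(x)), alpha)
    print("root:", x)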
Class 13, C&O 466/666
- Theory of Constrained Optimization (Chap. 12 - all)
- Definitions: constrained program, feasible set Omega, tangent cone of Omega at x, polar cone (dual cone)
- Geometric necessary conditions of optimality (extension of Fermat's theorem) for min f(x), x in Omega
- Geometric characterization of optimality for the convex case, min f(x), x in Omega, where f is a convex function and Omega is a convex set
Class 13A, C&O 466/666
- Theory of Constrained Optimization, Chap. 12, cont...
- Definitions: constrained program, feasible set Omega, tangent cone of Omega at x, polar cone (dual cone)
- Geometric necessary conditions of optimality (extension of Fermat's theorem) for min f(x), x in Omega
- Geometric characterization of optimality for the convex case, min f(x), x in Omega, where f is a convex function and Omega is a convex set
Class 14, C&O 466/666
- Theory of Constrained Optimization, Chap. 12, cont...
- Definitions: local vs global minimum, smoothing problems by adding constraints, active constraints
- tangent and linearizing cones and cone of gradients
- Lemma: K equals its second polar, K = K^{++}, if and only if K is a closed convex cone (c.c.c.)
- first and second order optimality conditions (2nd order conditions WITHOUT proofs)
- constraint qualifications: weakest CQ; linear independence CQ; Mangasarian-Fromovitz CQ
Class 15, C&O 466/666
- Theory of Constrained Optimization, Chap. 12, cont... (with basic convex analysis added)
- Basic Hyperplane Separation Theorem (with proof)
- Lemma: K is a closed convex cone if and only if K = K^{++} (for the derivation of Farkas' Lemma)
- Linearizing Cone, Cone of Gradients, Weakest Constraint Qualification, Karush-Kuhn-Tucker conditions (see the sketch after this list)
- Second order conditions (Section 12.4 - without proofs)
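A small numeric illustration of the Karush-Kuhn-Tucker conditions on a toy problem of my own (projection of a point onto the unit disk), where the solution and the multiplier are known in closed form.

    import numpy as np

    # Toy problem (my own) to illustrate the KKT conditions:
    #   minimize  f(x) = (x1 - 2)^2 + (x2 - 1)^2   subject to  c(x) = 1 - x1^2 - x2^2 >= 0
    # The solution projects the point (2, 1) onto the unit disk: x* = (2, 1)/sqrt(5),
    # with multiplier lambda* = sqrt(5) - 1 (found by hand from the stationarity condition).
    a = np.array([2.0, 1.0])
    x_star = a / np.sqrt(5.0)
    lam = np.sqrt(5.0) - 1.0

    grad_f = 2.0 * (x_star - a)
    grad_c = -2.0 * x_star
    c_val = 1.0 - x_star @ x_star

    # KKT: stationarity of L = f - lambda*c, primal/dual feasibility, complementarity.
    print("stationarity    ||grad f - lambda*grad c|| =", np.linalg.norm(grad_f - lam*grad_c))
    print("primal feas.    c(x*) =", c_val)
    print("dual feas.      lambda =", lam)
    print("complementarity lambda*c(x*) =", lam * c_val)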
Class 16, C&O 466/666
- Interior-point methods for LP and SDP (Chap. 14, pgs 395-409)
- Detailed algorithm with an example of the SDP max-cut relaxation
Class 17, C&O 466/666
- Theory of Constrained Optimization, Chap. 12, cont... (with basic convex analysis/duality)
- generalized Farkas' Lemma (with respect to a cone ordering - with examples, e.g. semidefinite programming)
- examples, e.g. quadratic programming
Class 18, C&O 466/666
- Fundamentals of Algorithms, Chap. 15
- problem classes, pgs 421-425
- elimination of variables, pgs 426-428
- merit functions, pgs 434-437
Class 19, C&O 466/666
- Quadratic Programming, Chap. 16
- Example: Portfolio Optimization, pgs 442-443
- Equality Constraints, pgs 443-447 (see the sketch after this list)
- Inequality Constraints, pgs 453-466
- Duality, pgs 484-485
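A minimal Python sketch (my own small example) solving an equality-constrained QP directly from its KKT system.

    import numpy as np

    def solve_eq_qp(G, c, A, b):
        """Solve  min 0.5 x^T G x + c^T x  s.t.  A x = b  via its KKT system."""
        n, m = G.shape[0], A.shape[0]
        # KKT system:  [ G  -A^T ] [x  ]   [-c]
        #              [ A    0  ] [lam] = [ b]
        K = np.block([[G, -A.T], [A, np.zeros((m, m))]])
        rhs = np.concatenate([-c, b])
        sol = np.linalg.solve(K, rhs)
        return sol[:n], sol[n:]          # primal solution x, Lagrange multipliers lam

    # Assumed small example (my own numbers): 2 variables, 1 equality constraint.
    G = np.array([[2.0, 0.0], [0.0, 2.0]])   # f(x) = x1^2 + x2^2
    c = np.zeros(2)
    A = np.array([[1.0, 1.0]])               # x1 + x2 = 1
    b = np.array([1.0])

    x, lam = solve_eq_qp(G, c, A, b)
    print("x =", x, " lambda =", lam)        # expect x = (0.5, 0.5), lambda = 1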
Class 20, C&O 466/666
- Penalty, Barrier, and Augmented Lagrangian Methods (Chap. 17)
- Quadratic Penalty Method, pgs 492-500: description, example, algorithmic framework, Theorems 17.1 (with proof) and 17.2 (without proof); see the sketch after this list
- discussion of ill-conditioning on pg 499 (equations 17.15, 17.16)
- Outline of Logarithmic Barrier Method, pgs 500-513
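A quadratic penalty sketch on a toy equality-constrained problem of my own, using scipy's BFGS routine as the inner unconstrained solver; the penalty form Q(x; mu) = f(x) + (mu/2)*c(x)^2 and the parameter schedule are illustrative choices, not the text's exact framework.

    import numpy as np
    from scipy.optimize import minimize

    # Assumed toy problem: minimize f(x) = x1 + x2 subject to c(x) = x1^2 + x2^2 - 2 = 0,
    # whose solution is (-1, -1).
    f = lambda x: x[0] + x[1]
    c = lambda x: x[0]**2 + x[1]**2 - 2.0

    x = np.array([0.0, 0.0])                 # starting point
    for mu in [1.0, 10.0, 100.0, 1000.0]:
        Q = lambda x, mu=mu: f(x) + 0.5 * mu * c(x)**2
        x = minimize(Q, x, method="BFGS").x  # warm-start from the previous minimizer
        print(f"mu = {mu:7.1f}   x = {x}   c(x) = {c(x):.2e}")
    # c(x) -> 0 only as mu -> infinity, and the penalty Hessian becomes increasingly
    # ill-conditioned (cf. the discussion on pg 499).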
Class 21, C&O 466/666
- Logarithmic Barrier Method cont...
- Examples 17.2 and 17.3 in the text
- Algorithm Framework 17.2
- Handling Equality Constraints, pgs 509-510
- Augmented Lagrangian Methods, pgs 513-516 (see the sketch after this list)
- Properties, pgs 518-521 (Theorem 17.5 with proof, Theorem 17.6)
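An augmented Lagrangian sketch (my own code, on the same toy problem as the quadratic penalty sketch above), with the first-order multiplier update lam <- lam - mu*c(x) after each inner minimization; scipy's BFGS routine is again used as the inner solver.

    import numpy as np
    from scipy.optimize import minimize

    # Toy problem: minimize f(x) = x1 + x2 subject to c(x) = x1^2 + x2^2 - 2 = 0.
    # Augmented Lagrangian for one equality constraint:
    #   L_A(x, lam; mu) = f(x) - lam*c(x) + (mu/2)*c(x)^2
    f = lambda x: x[0] + x[1]
    c = lambda x: x[0]**2 + x[1]**2 - 2.0

    x, lam, mu = np.array([0.0, 0.0]), 0.0, 10.0
    for k in range(8):
        LA = lambda x, lam=lam, mu=mu: f(x) - lam*c(x) + 0.5*mu*c(x)**2
        x = minimize(LA, x, method="BFGS").x
        lam = lam - mu * c(x)                 # first-order multiplier update
        print(k, x, lam, c(x))
    # Unlike the pure penalty method, c(x) -> 0 without driving mu to infinity,
    # and lam converges to the Lagrange multiplier at the solution (-1, -1).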
Class 22, C&O 466/666
- Sequential Quadratic Programming, Chap 18, pgs ?????