Continuous Optimization - 466/666 - Winter 1997
This course provides a rigorous, up-to-date treatment of topics in
Continuous Optimization (Nonlinear Programming). It also takes a
hands-on approach, with exposure to existing software packages.
Instructor:
Professor Henry Wolkowicz,
MC 6065, ext. 5589, henry@orion.math.uwaterloo.ca
Office Hours:
- Henry Wolkowicz, MC 6065: M 12:30-1:30, R 3-4
Lectures:
Text:
- Nonlinear Programming, D.P. Bertsekas, Athena Scientific
  (the chapter, page, and exercise references in the assignments below are to this text)
Additional References:
- Practical Optimization, P.E. Gill, W. Murray, and M.H. Wright
- Numerical Methods for Unconstrained Optimization and Nonlinear Equations, J.E. Dennis and R.B. Schnabel
- Nonlinear Programming: Theory and Algorithms, M.S. Bazaraa, H.D. Sherali, and C.M. Shetty
- Linear and Nonlinear Programming, D.G. Luenberger, second edition
- Nonlinear Programming, Peressini, Sullivan, and Uhl
- Practical Methods of Optimization, R. Fletcher
- Mathematical Programming Methods, G. Zoutendijk
- Nonlinear Programming, G.P. McCormick
- Mathematical Programming: Theory and Algorithms, M. Minoux (QA402.5.M5613)
- Optimization by Vector Space Methods, D.G. Luenberger
- Convex Analysis, R.T. Rockafellar
- Theory of Extremal Problems, A.D. Ioffe and V.M. Tihomirov
Home Page, C&O 466/666,
ftp://orion.uwaterloo.ca/pub/henry/teaching/w97/666.w97/readme.html
Term Work:
Term work will consist of homework problems.
(See the schedule of assignments below.)
Final Exam:
A 3-hour exam, scheduled by the registrar.
(Please see the detailed course outline for topics covered during the
semester.)
Marking Scheme:
- Homework ............ 50%
- Final Exam .......... 50%
Course Outline:
Major topics and sub-topics are listed below; resources for the different topics are included.
- Unconstrained Optimization (Chapter 1)
  - Optimality Conditions: first and second order conditions; proofs; the convex case
  - Gradient Methods - Convergence: steepest descent; stepsize rules; rates of convergence; the Armijo rule, with proof; the concepts of sufficient decrease and gradient-related directions; behaviour on quadratic functions (an illustrative sketch of a gradient method with the Armijo rule follows this outline)
  - Newton's Method and Variations: outline of conjugate gradient, quasi-Newton, and trust region methods; detailed proof of convergence, with rates, for Newton's method; steepest descent convergence on quadratics
  - Least Squares Problems
- Optimization Over a Convex Set (Chapter 2)
  - Optimality Conditions
  - Feasible Directions Methods
- Lagrange Multiplier Theory (Chapter 3)
  - Necessary Conditions for Equality Constraints
  - Sufficient Conditions and Sensitivity Analysis
  - Inequality Constraints
- Lagrange Multiplier Algorithms (Chapter 4)
  - Barrier and Interior Point Methods
  - Penalty and Augmented Lagrangian Methods: the quadratic penalty function method
  - Exact Penalty Methods - Sequential Quadratic Programming
  - Lagrangian Methods
- Duality and Convex Programming (Chapter 5)
  - The Dual Problem: Lagrange multipliers; the Weak Duality Theorem
  - Strong Duality
- Duality Methods (Chapter 6)
  - Nondifferentiable Optimization Methods
  - Lagrangian Relaxation
- Nonlinear Programming Software
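
The sketch below is a minimal, illustrative example (not course-supplied code) of steepest descent with Armijo backtracking on a toy quadratic, included as a concrete reference for the sufficient-decrease condition named in the outline. The test problem, the parameters sigma, beta, and s, and the tolerance are arbitrary choices made purely for illustration.

    # Illustrative sketch only (not course-supplied code): steepest descent
    # with Armijo backtracking on a toy quadratic f(x) = 0.5*x'Qx - b'x.
    # The parameters sigma, beta, s and the test problem are arbitrary choices.
    import numpy as np

    def armijo_stepsize(f, grad, x, d, sigma=1e-4, beta=0.5, s=1.0):
        # Backtrack t = s, s*beta, s*beta^2, ... until the sufficient-decrease
        # (Armijo) condition f(x + t*d) <= f(x) + sigma*t*grad(x)'d holds.
        t = s
        fx, slope = f(x), grad(x).dot(d)
        while f(x + t * d) > fx + sigma * t * slope:
            t *= beta
        return t

    def steepest_descent(f, grad, x0, tol=1e-6, maxit=500):
        x = x0.astype(float).copy()
        for k in range(maxit):
            g = grad(x)
            if np.linalg.norm(g) <= tol:   # stationarity test
                break
            d = -g                         # steepest-descent direction
            x = x + armijo_stepsize(f, grad, x, d) * d
        return x, k

    # Example: an ill-conditioned 2x2 quadratic; the minimizer is Q^{-1} b.
    Q = np.diag([10.0, 1.0])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x.dot(Q).dot(x) - b.dot(x)
    grad = lambda x: Q.dot(x) - b
    x_star, iters = steepest_descent(f, grad, np.array([5.0, 5.0]))
    print(x_star, iters)                   # approaches (0.1, 1.0)

On the quadratic above, the iteration count reflects the linear convergence rate of steepest descent, which depends on the conditioning of Q, as discussed in Chapter 1.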
Schedule of Assignments:
- Homework #1 (Unconstrained Optimization)
  Due: Tuesday, January 21
  Reading:
  Problems:
  - Bertsekas, p. 15, #1.1.6 (Weber Point); solution available as a LaTeX file
  - Bertsekas, p. 51, #1.2.14 (Steepest Descent); solutions available as LaTeX files
  - Bertsekas, p. 115, #1.5.3 (Gauss-Newton); use the Optimization Technology Center (NEOS) to solve a similar problem, but with k randomly generated data points, k = 50; if possible, discuss the solver that you used and the algorithm
  - Bertsekas, p. 141, #1.7.2 (Limited Memory BFGS)
- Homework #2 (Optimization Over a Convex Set)
  Due: Thursday, February 6
  Reading:
  Problems:
  - Bertsekas, p. 187, #2.1.10 (Second Order Nec. Opt. Cond.)
  - Bertsekas, p. 188, #2.1.11 (Second Order Suff. Opt. Cond.)
  - Bertsekas, p. 190, #2.1.16 (Fractional Programming)
  - Bertsekas, p. 223, #2.3.8 (The Proximal Minimization Algorithm)
- Homework #3 (Lagrange Multiplier Theory)
  Due: Thursday, February 20
  Reading:
  - Bertsekas, topics in Chapter 3
  Problems:
  - Bertsekas, p. 271, #3.1.6 (The AGM Inequality)
  - Bertsekas, p. 291, #3.3.6 (Minimax)
  - Bertsekas, p. 309, #3.4.4 (Transportation Problem)
- Homework #4 (Lagrange Multiplier Algorithms)
  Due: Tuesday, March 11
  Reading:
  - Bertsekas, topics in Chapter 4
  Problems:
  - Bertsekas, p. 327, #4.1.1 (Central Path); for the program, you can use OTC-NEOS
  - Bertsekas, p. 360, #4.2.8 (Ill-Conditioning of Penalty Methods); an illustrative sketch of the quadratic penalty method appears after this schedule
  - Bertsekas, p. 271, #4.3.4 (Exact Penalty Functions)
  - Bertsekas, p. 271, #4.3.9 (Maratos' Effect)
- Homework #5
  Due: Thursday, April 3
  Reading:
  - Bertsekas, topics in Chapters 5 and 6
  Problems:
  - Bertsekas, p. 436, #5.1.7 (Duality Gap for the Knapsack Problem)
  - Bertsekas, p. 450, #5.3.1 (Boundedness of Lagrange Multipliers)
  - Bertsekas, p. 530, #6.4.1 (Linear Problems with Coupling Variables)
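
For reference alongside Exercise 4.2.8, here is a minimal, illustrative sketch (not part of the assignment) of the quadratic penalty method on a hypothetical toy problem: minimize x1 + x2 subject to x1^2 + x2^2 = 2. The toy problem, the penalty parameter schedule, and the use of scipy's BFGS routine for the inner minimizations are assumptions made purely for illustration.

    # Illustrative sketch only (not part of the assignment): the quadratic
    # penalty method on a hypothetical toy problem,
    #     minimize  x1 + x2   subject to  h(x) = x1^2 + x2^2 - 2 = 0,
    # whose solution is x* = (-1, -1) with Lagrange multiplier lambda* = 1/2.
    # The penalty schedule and the use of BFGS for the inner minimizations
    # are arbitrary choices for illustration.
    import numpy as np
    from scipy.optimize import minimize

    def f(x):
        return x[0] + x[1]

    def h(x):
        return x[0]**2 + x[1]**2 - 2.0

    def P(x, c):
        # quadratic penalty function P_c(x) = f(x) + (c/2) * h(x)^2
        return f(x) + 0.5 * c * h(x)**2

    x = np.array([0.0, 0.0])                     # starting point
    for c in [1.0, 10.0, 100.0, 1000.0]:
        res = minimize(lambda z: P(z, c), x, method="BFGS")
        x = res.x                                # warm start the next subproblem
        print("c=%8.1f  x=%s  h(x)=%9.2e  c*h(x)=%6.3f" % (c, x, h(x), c * h(x)))
    # The constraint violation h(x) goes to zero only as c grows, while
    # c*h(x) approaches the multiplier 1/2; the Hessian of P_c becomes
    # increasingly ill-conditioned as c increases.

Replacing P_c with the augmented Lagrangian f(x) + lambda*h(x) + (c/2)*h(x)^2 and updating lambda between subproblems avoids driving c to infinity, which is the motivation for the multiplier methods of Chapter 4.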