Syllabus for Nonlinear Optimization
This course provides an introductory treatment of topics in
nonlinear optimization, taking a hands-on approach with exposure
to user-friendly existing software packages (e.g., CVX).
We cover the principles of nonlinear continuous optimization, that is,
minimizing an objective function that depends nonlinearly and continuously
on unknown variables that satisfy constraints. Convex optimization will
be introduced, along with applications to data mining and machine learning
("data science").
Lectures start Thursday Sept. 8 and end Tuesday Dec. 6, 2022.
Midterm EXAM: date TBA; location TBA
Final EXAM: TBA
- Introduction to Nonlinear Optimization
  - Notation and general formalism
  - Preliminary calculus and linear algebra results
- Unconstrained Optimization
  - Optimality conditions
  - Coercivity and existence of a minimizer
  - Quadratic functions
  - (Non)linear least squares
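To make the least-squares topic above concrete, here is a small linear least-squares fit solved via the normal equations A^T A c = A^T y (an illustrative sketch with made-up data; NumPy is an assumption):

```python
import numpy as np

# Fit a line y ≈ c0 + c1 * t to data by linear least squares:
# minimize ||A c - y||_2^2 over c = (c0, c1).
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.9])
A = np.column_stack([np.ones_like(t), t])  # columns: intercept, slope

# Normal equations (valid here since A has full column rank).
c = np.linalg.solve(A.T @ A, A.T @ y)
print(c)

# Cross-check against NumPy's built-in least-squares routine.
c_ref, *_ = np.linalg.lstsq(A, y, rcond=None)
print(c_ref)
```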
- Algorithms for Unconstrained Optimization
  - Descent and conjugate gradient methods
  - Newton (quasi-Newton) and trust region type methods
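A minimal sketch of the descent methods listed above: fixed-step gradient descent on a strictly convex quadratic (the function and step size are illustrative choices, not from the syllabus):

```python
import numpy as np

# Minimize f(x) = 0.5 x^T Q x - b^T x; the unique minimizer solves Q x = b.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
b = np.array([1.0, 1.0])

def grad(x):
    return Q @ x - b

x = np.zeros(2)
step = 0.3  # fixed step; convergence requires step < 2 / lambda_max(Q)
for _ in range(200):
    x = x - step * grad(x)

print(x)                      # approaches the solution of Q x = b
print(np.linalg.solve(Q, b))  # exact minimizer, for comparison
```

Conjugate gradient and Newton-type methods replace the fixed gradient step with more informed search directions, which typically cuts the iteration count dramatically.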
- Convex Sets and Functions
  - Geometry
  - Separation theorems
- Constrained Optimization
  - Characterizations of optimality
  - Algorithms
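One standard algorithm for constrained problems of the kind above is projected gradient descent; a minimal sketch for minimizing a quadratic over the nonnegative orthant (the specific problem is an illustrative choice):

```python
import numpy as np

# Minimize f(x) = ||x - c||^2 subject to x >= 0.
# Projection onto the feasible set {x : x >= 0} is componentwise max(x, 0).
c = np.array([1.0, -2.0, 0.5])

def project(x):
    return np.maximum(x, 0.0)

x = np.zeros(3)
step = 0.4
for _ in range(100):
    g = 2.0 * (x - c)          # gradient of f
    x = project(x - step * g)  # gradient step, then project back

print(x)  # converges to the componentwise max(c, 0)
```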
- Applications in Machine Learning and Big Data