You can use MATLAB on many machines on campus.
An online course is available at:
http://saw.uwaterloo.ca/matlab
It works with virtually any browser that supports the Flash plug-in
on Windows, Linux, Solaris, or Mac OS. Click "MATLAB Mastery I" to start.
Problems
Go to the
NEOS Solvers
web page. Use the unconstrained optimization package NMTR
to solve the extended Rosenbrock function
(see page 191 in the text, edition 2)
with n=48 variables and parameter values alpha=20 and
alpha=350.
It seems that NMTR no longer has a sample submission at
NEOS; if NMTR does not work for you, please use MATLAB
instead, via the function/command fminunc.
Hand in (or email) a copy of the outputs you get.
In each case, point out the solution and the optimal value.
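If you end up using fminunc, a minimal sketch of the kind of run intended is below. The form of the extended Rosenbrock function (the usual Moré-Garbow-Hillstrom one) and the starting point are assumptions, so check them against page 191 of the text before relying on the output.

    % Sketch only: assumes the Moré-Garbow-Hillstrom extended Rosenbrock form.
    n = 48;
    for alpha = [20 350]
        f = @(x) sum(alpha*(x(2:2:end) - x(1:2:end).^2).^2 ...
                     + (1 - x(1:2:end)).^2);
        x0 = repmat([-1.2; 1], n/2, 1);   % standard starting point (assumed)
        opts = optimset('Display', 'final', 'MaxFunEvals', 1e5, 'MaxIter', 1e4);
        [xstar, fval] = fminunc(f, x0, opts);
        fprintf('alpha = %g: optimal value %.3e\n', alpha, fval);
    end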
Now try the command
demo toolbox optim
You will see a list of the optimization functions that are available
with the Optimization Toolbox. Choose "Minimization of the Banana
Function", or run
bandem
to test out several unconstrained algorithms on the banana
function, i.e. Rosenbrock's function. This is a classical example of a
function with a narrow valley that exhibits very slow convergence
for steepest descent type methods.
How many function evaluations
did the minimization take for: steepest descent; simplex
search; Broyden-Fletcher-Goldfarb-Shanno; Davidon-Fletcher-Powell;
Levenberg-Marquardt?
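The bandem demo is interactive and reports the counts on screen. As a cross-check, here is a hedged sketch of how to recover function-evaluation counts programmatically for two of the methods (simplex search via fminsearch, BFGS via fminunc); the starting point is an assumption, and the remaining methods were menu options in older toolbox releases.

    banana = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;  % Rosenbrock's function
    x0 = [-1.9; 2];                                      % assumed starting point
    [~, ~, ~, outNM] = fminsearch(banana, x0);           % Nelder-Mead simplex search
    opts = optimset('LargeScale', 'off');                % medium-scale: BFGS quasi-Newton
    [~, ~, ~, outQN] = fminunc(banana, x0, opts);
    fprintf('simplex: %d evals, BFGS: %d evals\n', ...
            outNM.funcCount, outQN.funcCount);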
Link to further problems
(Hint: Note that another equivalent characterization of strongly
convex functions involves the first-order condition
f(y) >= f(x) + grad f(x)^T (y - x) + (m/2)||y - x||^2.
Also, there was a typo in the question: the function is a strongly
convex quadratic function.)
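For the quadratic case the hint is quick to verify. A sketch, assuming f(x) = (1/2) x^T Q x - b^T x with Q symmetric and Q - mI positive semidefinite for some m > 0:

    % The Taylor expansion of a quadratic is exact, so with Q >= mI:
    \[
      f(y) = f(x) + \nabla f(x)^T (y-x) + \tfrac{1}{2}(y-x)^T Q (y-x)
           \;\ge\; f(x) + \nabla f(x)^T (y-x) + \frac{m}{2}\,\|y-x\|^2,
    \]
    % since (y-x)^T Q (y-x) >= m ||y-x||^2 whenever Q - mI is
    % positive semidefinite.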
Homework #3
(CG, NLLS)
Due: Wed, February 8, 2012 (at start of class)
Reading
Consider the
MATLAB file cgtest.m. This file generates a linear system Ax=b, where
A is positive definite with only 3 distinct eigenvalues; the system is
then solved using a variant of the CG strategy.
Run the file to generate a random problem and verify that you have the
solution. Save the data in a .mat file and send it in by email.
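cgtest.m itself is not reproduced here; a minimal sketch of a comparable experiment, with the dimension and eigenvalue choices as assumptions, would be:

    n = 50;
    [Q, ~] = qr(randn(n));                         % random orthogonal matrix
    d = [ones(20,1); 5*ones(20,1); 9*ones(10,1)];  % only 3 distinct eigenvalues (assumed values)
    A = Q * diag(d) * Q';
    A = (A + A')/2;                                % symmetrize against round-off
    b = randn(n, 1);
    [x, flag, relres, iter] = pcg(A, b, 1e-10, n); % plain CG, no preconditioner
    fprintf('CG stopped after %d iterations, relative residual %.2e\n', ...
            iter, relres);
    % In exact arithmetic CG terminates in at most 3 steps here,
    % one per distinct eigenvalue.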
Bonus 1:
Can you prove that you always get convergence to the solution of Ax=b?
(Hint: Try it on a two-dimensional example; draw the eigenvectors.
Also, consider the convergence of steepest descent when eigenvectors are
involved.)
And, in what order should the eigenvalues be used so that a best
approximation is obtained at each iteration?
Do Problem 5.1 in the text to implement Algorithm 5.2.
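For reference, a minimal MATLAB sketch in the spirit of Algorithm 5.2 (CG for Ax=b with A symmetric positive definite); treat it as a starting point, not the required implementation:

    function x = cg52(A, b, x, tol, maxit)
    % Plain conjugate gradient, following the structure of Algorithm 5.2.
    r = A*x - b;                       % residual, i.e. gradient of the quadratic
    p = -r;
    for k = 1:maxit
        Ap    = A*p;
        alpha = (r'*r) / (p'*Ap);      % exact line search step
        x     = x + alpha*p;
        rnew  = r + alpha*Ap;          % updated residual
        if norm(rnew) < tol, break; end
        beta  = (rnew'*rnew) / (r'*r); % conjugacy-preserving update
        p     = -rnew + beta*p;
        r     = rnew;
    end
    end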
Bonus 2: Then try the same with the preconditioned CG Algorithm 5.3.
What would a reasonable preconditioner be? The diagonal elements? Or the
diagonal together with the super- and sub-diagonals?
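As a sketch of how those two candidates could be compared using MATLAB's built-in pcg (reusing A, b, and n from the experiment above; the tolerance is an assumption):

    M1 = diag(diag(A));                 % Jacobi (diagonal) preconditioner
    B  = spdiags(sparse(A), -1:1);      % extract sub-, main, and super-diagonals
    T  = spdiags(B, -1:1, n, n);        % tridiagonal preconditioner
    [~, ~, ~, itDiag] = pcg(A, b, 1e-10, n, M1);
    [~, ~, ~, itTri ] = pcg(A, b, 1e-10, n, T);
    fprintf('diagonal: %d iterations, tridiagonal: %d iterations\n', ...
            itDiag, itTri);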
Test out the data fitting demo (medium-scale nonlinear data
fitting,
fitdemo).
This is one of the demos in the
Optimization Toolbox.
The example demonstrates fitting a nonlinear function to a set of data.
Provide the plots of the data points; then provide the plots, number of
iterations, function evaluations, and norm of the residual of the fit when using
the Gauss-Newton method (with mixed polynomial interpolation) and also when
using unconstrained optimization.
(See the
Least Squares (Model Fitting) Algorithms documentation in
MATLAB.)
Include the algebraic form of the nonlinear function that is used.
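For comparison, a hedged sketch of the same kind of fit done directly with lsqnonlin: the model form (a sum of two decaying exponentials) matches what fitdemo fits in older toolbox releases, but the data, coefficients, and starting guess below are made up for illustration.

    t = (0:0.1:2)';                                    % synthetic data (assumed)
    y = 3*exp(-1.3*t) + 4*exp(-4*t) + 0.05*randn(size(t));
    model = @(p, t) p(1)*exp(-p(3)*t) + p(2)*exp(-p(4)*t);
    resid = @(p) model(p, t) - y;                      % residual vector for lsqnonlin
    p0 = [1; 1; 1; 2];                                 % assumed starting guess
    [p, resnorm, ~, ~, output] = lsqnonlin(resid, p0);
    fprintf('iterations %d, function evaluations %d, residual norm %.4g\n', ...
            output.iterations, output.funcCount, sqrt(resnorm));
    plot(t, y, 'o', t, model(p, t), '-');              % data points and fitted curve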