Class 6, C&O 466/666
-
Line Search Methods, continued ... (Chapter 3)
-
Rate of Convergence
-
Theorems 3.3, 3.4 (linear convergence rate of steepest descent -
dependence on the condition number of the Hessian)
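A reminder of the rate in question (a sketch for the convex quadratic
case with exact line searches; here Q is the Hessian, x* the minimizer,
0 < lambda_1 <= ... <= lambda_n the eigenvalues of Q, and
kappa = lambda_n/lambda_1 its condition number):
\[
  \|x_{k+1}-x^*\|_Q^2 \;\le\;
  \Bigl(\frac{\lambda_n-\lambda_1}{\lambda_n+\lambda_1}\Bigr)^{2}\|x_k-x^*\|_Q^2
  \;=\;
  \Bigl(\frac{\kappa-1}{\kappa+1}\Bigr)^{2}\|x_k-x^*\|_Q^2,
  \qquad \|y\|_Q^2 := y^{T}Q\,y,
\]
so an ill-conditioned Hessian (large kappa) gives a contraction factor
close to 1, i.e., slow linear convergence.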
-
Theorem 3.5 (Dennis-Moré condition - for superlinear convergence, the
quasi-Newton Hessian approximation needs only (asymptotic) convergence to
the true Hessian along the search directions)
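For reference, a sketch of the condition itself (assuming unit steps
x_{k+1} = x_k + p_k with p_k = -B_k^{-1} grad f(x_k), and x_k -> x* where
grad f(x*) = 0 and the Hessian is positive definite): convergence is
superlinear if and only if
\[
  \lim_{k\to\infty}\frac{\bigl\|\bigl(B_k-\nabla^2 f(x^*)\bigr)p_k\bigr\|}{\|p_k\|}=0,
\]
i.e., the approximations B_k only have to agree with the true Hessian
along the search directions, not converge to it as matrices.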
-
Theorem 3.7 (with proof! - quadratic convergence of Newton's method near
the optimum)
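A sketch of the local quadratic rate (assuming grad f(x*) = 0, a positive
definite Hessian at x*, a Lipschitz constant L for the Hessian near x*,
and x_k sufficiently close to x*):
\[
  \|x_{k+1}-x^*\| \;\le\; L\,\bigl\|\nabla^2 f(x^*)^{-1}\bigr\|\,\|x_k-x^*\|^{2},
\]
so the error is roughly squared at each iteration, and the gradient norms
||grad f(x_k)|| converge to zero quadratically as well.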
Reference (for your interest only):
Shi, Yixun, Globally convergent algorithms for unconstrained
optimization, Comput. Optim. Appl. 16 (2000), no. 3, 295--308.