TITLE:  Two Topics on the Complexity of Convex Optimization, One 
Computational and One Theoretical

ABSTRACT:  We consider the following quite general convex optimization 
problem format: $$\begin{array}{lcl} G_d: & \mbox{minimize}_x & c^{T}x \\
& \mbox{s.t.} & Ax = b \\ & & x \in P, \end{array}$$
\noindent where $P$ is a closed convex set, not necessarily a cone.  Linear 
optimization is a special case of this format when $P$ is polyhedral, and a 
modern complexity theory for linear optimization has been developed based on 
a notion of condition number $C(d)$ for the data $d=(A,b,c)$ of $G_d$.  We 
develop computational experience and test the practical relevance of this 
condition number theory on problem instances that one might encounter in 
practice, using the NETLIB suite of linear programming instances.  We present 
empirical evidence that 42% of the variation in interior-point method (IPM) 
iterations among the NETLIB suite problem instances is accounted for by 
$\log C(d)$ of the problem instances after pre-processing.
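
For illustration only: the 42% figure can be read as the R^2 of a simple 
linear regression of IPM iteration counts on $\log C(d)$.  The Python sketch 
below shows how such a fit and its R^2 would be computed; the data it uses 
are randomly generated stand-ins, not the actual NETLIB measurements, and 
all variable names are hypothetical.

    import numpy as np

    # Illustrative only: randomly generated stand-ins for the NETLIB data.
    rng = np.random.default_rng(0)
    log_Cd = rng.uniform(2.0, 12.0, size=80)          # hypothetical values of log C(d)
    iters = 12 + 2.5 * log_Cd + rng.normal(0, 5, 80)  # hypothetical IPM iteration counts

    # Ordinary least squares: iterations ~ beta0 + beta1 * log C(d)
    X = np.column_stack([np.ones_like(log_Cd), log_Cd])
    beta, *_ = np.linalg.lstsq(X, iters, rcond=None)

    # R^2 = fraction of the variation in iteration counts accounted for by
    # log C(d); a value of 0.42 would correspond to the 42% figure quoted above.
    resid = iters - X @ beta
    r2 = 1.0 - np.sum(resid**2) / np.sum((iters - iters.mean())**2)
    print(f"slope = {beta[1]:.2f}, R^2 = {r2:.2f}")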

Our theoretical work concerns the complexity of convex optimization using 
geometry-based measures rather than algebraic or data-based measures to 
obtain complexity bounds.  We bound the complexity of computing an 
almost-optimal solution of $G_d$ in terms of natural geometry-based 
measures of the feasible region and of the level set of almost-optimal 
solutions, relative to a given {\em reference point} $x^r$ that might be 
close to the feasible region and/or to the almost-optimal level set.
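
For illustration only: the sketch below sets up a small instance of the 
format $G_d$ in Python with the cvxpy package (an assumed tool, not part of 
the work described above), taking $P$ to be a Euclidean ball (closed and 
convex, but not a cone), and computes the Euclidean distance from a 
reference point $x^r$ to the feasible region as one example of a geometric 
quantity relative to $x^r$.  The data, the choice of $P$, and the choice of 
distance are hypothetical and are not the measures used in the theoretical 
work; the sketch shows only the flavor of such a computation.

    import cvxpy as cp
    import numpy as np

    # Illustrative instance of the format G_d; the data are randomly generated.
    rng = np.random.default_rng(0)
    m, n = 3, 6
    A = rng.standard_normal((m, n))
    x_feas = rng.standard_normal(n)
    x_feas /= np.linalg.norm(x_feas)     # unit vector, so x_feas lies inside P
    b = A @ x_feas                       # makes Ax = b consistent with x in P
    c = rng.standard_normal(n)

    # P taken here as a Euclidean ball of radius 2: a closed convex set, not a cone.
    x = cp.Variable(n)
    prob = cp.Problem(cp.Minimize(c @ x), [A @ x == b, cp.norm(x, 2) <= 2.0])
    prob.solve()
    print("optimal value of the G_d instance:", prob.value)

    # One natural geometric quantity relative to a reference point x^r:
    # the Euclidean distance from x^r to the feasible region {x : Ax = b, x in P}.
    x_r = np.zeros(n)                    # hypothetical reference point
    y = cp.Variable(n)
    dist = cp.Problem(cp.Minimize(cp.norm(y - x_r, 2)),
                      [A @ y == b, cp.norm(y, 2) <= 2.0])
    dist.solve()
    print("distance from x^r to the feasible region:", dist.value)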


**********************************************************************
Professor Robert M. Freund
MIT Sloan School of Management
Building E53-357
50 Memorial Drive
Cambridge, MA  02142-1347
E-mail:     rfreund@mit.edu
Telephone:  617-253-8997
Fax:        617-258-7579
http://web.mit.edu/rfreund/www
**********************************************************************