Title: Restarting Accelerated First Order Methods: Guaranteed Bounds and
Applications

Abstract:
First-order methods like the accelerated proximal gradient (APG) method have become exceedingly popular for solving convex composite problems as variable sizes grow into the billions. Restarting has been observed to improve the performance of many such algorithms. O'Donoghue and Candès introduced adaptive restart strategies for APG that rely on easy-to-compute conditions and yield a large performance boost in terms of function-value convergence. In general, these strategies are heuristics, and no proof of convergence is available.
We consider the case of one-dimensional functions, for which we prove that the gradient-based restart strategy of O'Donoghue and Candès improves the classic O(1/k^2) bound of APG. We also study the restarting scheme applied to the method of alternating projections (MAP) for two closed, nonempty, convex sets. MAP can be cast as a convex composite optimization problem, which makes it amenable to acceleration. We study the case of two hyperplanes in any dimension. Furthermore, we offer observations on why the restarts help, what makes a good restart condition, and what is needed to make progress in the general case.
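As a rough illustration (not part of the paper), the following minimal Python/NumPy sketch shows accelerated gradient descent for a smooth function together with the gradient-based adaptive restart test of O'Donoghue and Candès; the quadratic objective, step size, and iteration count are placeholder choices for this example only.

```python
import numpy as np

def accelerated_gradient_restart(grad_f, x0, step, iters=500):
    """Accelerated gradient descent with gradient-based adaptive restart.

    The momentum is restarted (theta <- 1) whenever
    grad_f(y_{k-1})^T (x_k - x_{k-1}) > 0, i.e. the last step moves
    against the negative gradient direction.
    """
    x_prev = np.array(x0, dtype=float)
    y = x_prev.copy()
    theta = 1.0
    for _ in range(iters):
        g = grad_f(y)
        x = y - step * g                      # gradient step from the extrapolated point
        if g @ (x - x_prev) > 0:              # gradient-based restart condition
            theta, y, x_prev = 1.0, x.copy(), x.copy()
            continue
        theta_next = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        y = x + ((theta - 1) / theta_next) * (x - x_prev)   # Nesterov extrapolation
        x_prev, theta = x, theta_next
    return x_prev

# Toy usage: minimize the ill-conditioned quadratic f(x) = 0.5 x^T A x.
A = np.diag([1.0, 100.0])
grad_f = lambda x: A @ x
x_star = accelerated_gradient_restart(grad_f, x0=[1.0, 1.0], step=1.0 / 100.0)
print(x_star)  # should be close to the minimizer at the origin
```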