Speaker: Taoli Zheng

Title: Doubly Smoothed Optimistic Gradients: A Universal Recipe for Smooth
Minimax Problems

Abstract: 
Smooth minimax optimization has widespread applications in machine learning
and operations research. However, existing algorithmic frameworks for convex
and nonconvex minimax optimization differ fundamentally, and the gap widens
further under other structural properties such as weak Minty-type conditions
and Kurdyka-\L{}ojasiewicz (K\L{}) properties. In this work, we introduce a
universal recipe to solve a broad class of smooth minimax optimization
problems, including convex-concave, nonconvex-concave, convex-nonconcave,
nonconvex-K\L{}, and K\L{}-nonconcave cases. The newly developed doubly
smoothed optimistic gradient descent ascent method (DS-OGDA) is universally
applicable across these scenarios with a single set of parameters,
eliminating the need for prior structural information to determine the step
size. Furthermore, when additional structural information is available,
DS-OGDA achieves optimal or best-known convergence guarantees for each
scenario.
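
The abstract names the algorithm but does not state its update rule. As a
rough illustration of what "doubly smoothed optimistic gradients" could look
like, the Python sketch below runs optimistic (past-gradient) descent ascent
on a surrogate that adds quadratic smoothing terms around slowly moving
anchors in both variables. The surrogate, the anchor updates, and every
parameter value (eta, mu, nu, beta) are assumptions made for illustration
only, not the parameters or the exact scheme from the paper.

    # Hypothetical sketch of a doubly smoothed optimistic gradient method.
    # Toy convex-concave saddle problem with its saddle point at the origin:
    #   f(x, y) = 0.5 * x**2 + 4 * x * y - 0.5 * y**2

    def grad_x(x, y):
        return x + 4.0 * y          # df/dx

    def grad_y(x, y):
        return 4.0 * x - y          # df/dy

    def ds_ogda(x, y, steps=2000, eta=0.05, mu=1.0, nu=1.0, beta=0.1):
        """Sketch: optimistic GDA on a doubly smoothed surrogate.

        Surrogate (an assumption, not the paper's exact construction):
            f(x, y) + (mu / 2) * (x - z)**2 - (nu / 2) * (y - v)**2,
        where the anchors z, v track the iterates by exponential averaging.
        """
        z, v = x, y                                   # smoothing anchors
        gx_prev = grad_x(x, y) + mu * (x - z)
        gy_prev = grad_y(x, y) - nu * (y - v)
        for _ in range(steps):
            gx = grad_x(x, y) + mu * (x - z)          # smoothed gradient in x
            gy = grad_y(x, y) - nu * (y - v)          # smoothed gradient in y
            x -= eta * (2.0 * gx - gx_prev)           # optimistic descent step
            y += eta * (2.0 * gy - gy_prev)           # optimistic ascent step
            gx_prev, gy_prev = gx, gy
            z += beta * (x - z)                       # drift anchors toward iterates
            v += beta * (y - v)
        return x, y

    print(ds_ogda(3.0, -2.0))   # approaches the saddle point (0.0, 0.0)

On this toy problem the iterates settle at the origin; the anchor terms act
as a proximal damping on both players, which is the intuition behind applying
smoothing to the min and the max side simultaneously.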