Title: Decentralized Composite Optimization in Stochastic Networks: A Dual Averaging Approach with Linear Convergence

Zirui Zhou

Abstract:
Decentralized optimization, particularly the class of decentralized
composite convex optimization (DCCO) problems, has found many
applications. Because communication congestion and random link dropouts
are ubiquitous in practice, it is highly desirable to design decentralized
algorithms that can handle stochastic communication networks. However,
most existing algorithms for DCCO work only in time-invariant networks
and cannot be extended to stochastic networks because they inherently
require a priori knowledge of the network topology. In this talk, we present a
new decentralized dual averaging (DDA) algorithm that can solve DCCO in
stochastic networks. Under a rather mild condition on stochastic
networks, we show that the proposed algorithm attains global linear
convergence if each local objective function is strongly convex. Our
algorithm substantially improves upon existing DDA-type decentralized
algorithms, which prior to our work were only known to converge
sublinearly. The key to achieving the improved rate is the design of a
novel dynamic averaging consensus protocol for DDA, which intuitively
leads to more accurate local estimates of the global dual variable. To
the best of our knowledge, this is the first linearly convergent
DDA-type decentralized algorithm and also the first algorithm that
attains global linear convergence for solving DCCO in stochastic
networks. Numerical results are also presented to support our design and
analysis. 
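The building block behind the protocol can be illustrated with a minimal sketch of generic dynamic average consensus (not the authors' exact protocol for DDA; the network, weights, and signals below are illustrative assumptions): each node tracks the network-wide average of time-varying local signals using only neighbor communication, via the update s_i^{k+1} = sum_j W_ij s_j^k + r_i^{k+1} - r_i^k. With a doubly stochastic mixing matrix W and trackers initialized to the local signals, the node average of the trackers equals the node average of the signals at every iteration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5  # number of nodes (illustrative)

# Metropolis-style weights on a ring graph: each node gives weight 1/3
# to each of its two neighbors and keeps the rest on itself, making W
# doubly stochastic by construction.
W = np.zeros((n, n))
for i in range(n):
    for j in ((i - 1) % n, (i + 1) % n):
        W[i, j] = 1.0 / 3.0
    W[i, i] = 1.0 - W[i].sum()

r = rng.standard_normal(n)  # local signals at step 0
s = r.copy()                # trackers initialized to the local signals

for k in range(50):
    r_new = r + 0.1 * rng.standard_normal(n)  # signals drift over time
    s = W @ s + (r_new - r)                   # dynamic averaging step
    r = r_new

# Invariant: the average of the trackers equals the average of the
# current signals, up to floating-point error.
print(abs(s.mean() - r.mean()))
```

In a DDA-type method the tracked signal would be related to local (accumulated) gradients, so that each node's tracker serves as a local estimate of the global dual variable; the sketch above only shows the averaging mechanism itself.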

This is a joint work with Changxin Liu, Jian Pei, Yong Zhang, and Yang Shi.