IDEAL: Inexact DEcentralized Accelerated Augmented Lagrangian Method

Yossi Arjevani, Joan Bruna, Bugra Can, Mert Gurbuzbalaban, Stefanie Jegelka, Hongzhou Lin

Spotlight presentation: Orals & Spotlights Track 21: Optimization
on Wed, Dec 9th, 2020 @ 15:20 – 15:30 GMT
Poster Session 4
on Wed, Dec 9th, 2020 @ 17:00 – 19:00 GMT
Abstract: We introduce a framework for designing primal methods in the decentralized optimization setting where the local functions are smooth and strongly convex. Our approach consists of approximately solving a sequence of sub-problems induced by the accelerated augmented Lagrangian method, thereby providing a systematic way to derive several well-known decentralized algorithms, including EXTRA and SSDA. When coupled with accelerated gradient descent, our framework yields a novel primal algorithm whose convergence rate is optimal, matching recently derived lower bounds. We provide experimental results that demonstrate the effectiveness of the proposed algorithm on highly ill-conditioned problems.
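
To give a rough feel for the kind of scheme the abstract describes, the sketch below runs an accelerated dual (multiplier) update around an inexactly solved augmented-Lagrangian subproblem for a decentralized consensus problem. It is an illustrative sketch only, not the paper's IDEAL algorithm: the gossip matrix W (a ring-graph Laplacian), the quadratic local objectives, the penalty rho, the step size, the momentum schedule, and the inner-loop budget are all assumptions made for demonstration. Each multiplication by W corresponds to one round of communication with graph neighbors, which is what keeps such a scheme decentralized.

# Illustrative sketch of an inexact accelerated augmented Lagrangian loop
# for decentralized consensus optimization (not the paper's exact algorithm;
# W, rho, eta, the momentum schedule, and the local objectives are assumptions).
import numpy as np

n, d = 5, 3                      # number of agents, decision-variable dimension
rng = np.random.default_rng(0)

# Hypothetical smooth, strongly convex local objectives:
#   f_i(x) = 0.5 * ||A_i x - b_i||^2 + 0.5 * mu * ||x||^2
A = rng.standard_normal((n, d, d))
b = rng.standard_normal((n, d))
mu = 0.1

def local_grad(i, xi):
    return A[i].T @ (A[i] @ xi - b[i]) + mu * xi

# Gossip matrix: Laplacian of a ring graph (symmetric PSD, kernel = consensus)
W = 2 * np.eye(n) - np.roll(np.eye(n), 1, axis=0) - np.roll(np.eye(n), -1, axis=0)

rho, eta = 1.0, 0.05             # penalty parameter, inner gradient step size
T_outer, T_inner = 200, 10       # outer (dual) iterations, inexact inner iterations

x = np.zeros((n, d))             # stacked local copies x_i
lam = np.zeros((n, d))           # multiplier for the constraint W x = 0
lam_prev = lam.copy()

for k in range(T_outer):
    # Extrapolate the dual variable (acceleration on the dual sequence)
    y = lam + (k / (k + 3)) * (lam - lam_prev)

    # Inexact subproblem: a few gradient steps on
    #   sum_i f_i(x_i) + <y, W x> + (rho/2) <x, W x>;
    # each W @ (...) is one round of neighbor communication.
    for _ in range(T_inner):
        grad = np.stack([local_grad(i, x[i]) for i in range(n)]) + W @ (y + rho * x)
        x = x - eta * grad

    # Multiplier update
    lam_prev, lam = lam, y + rho * (W @ x)

print("consensus violation:", np.linalg.norm(W @ x))

In this toy setup the local copies x_i are driven toward consensus while each agent only ever evaluates its own gradient and exchanges vectors with its neighbors through W; swapping the plain inner gradient steps for an accelerated local solver is the kind of refinement the abstract refers to.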
