

Poster

Near-Optimal Distributed Minimax Optimization under the Second-Order Similarity

Qihao Zhou · Haishan Ye · Luo Luo

West Ballroom A-D #6110
[ Paper ] [ Slides ] [ Poster ] [ OpenReview ]
Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract: This paper considers distributed convex-concave minimax optimization under the second-order similarity. We propose the stochastic variance-reduced optimistic gradient sliding (SVOGS) method, which takes advantage of the finite-sum structure in the objective by combining mini-batch client sampling with variance reduction. We prove that SVOGS achieves the $\varepsilon$-duality gap within $O(\delta D^2/\varepsilon)$ communication rounds, $O(n+\sqrt{n}\,\delta D^2/\varepsilon)$ communication complexity, and $\widetilde{O}\big(n+(\sqrt{n}\,\delta+L)D^2/\varepsilon\cdot\log(1/\varepsilon)\big)$ local gradient calls, where $n$ is the number of nodes, $\delta$ is the degree of the second-order similarity, $L$ is the smoothness parameter, and $D$ is the diameter of the constraint set. We verify that all of the above complexities (nearly) match the corresponding lower bounds. For the specific $\mu$-strongly-convex-$\mu$-strongly-concave case, our algorithm attains upper bounds on communication rounds, communication complexity, and local gradient calls of $O((\delta/\mu)\log(1/\varepsilon))$, $O((n+\sqrt{n}\,\delta/\mu)\log(1/\varepsilon))$, and $\widetilde{O}\big((n+(\sqrt{n}\,\delta+L)/\mu)\log(1/\varepsilon)\big)$ respectively, which are also nearly tight. Furthermore, we conduct numerical experiments to demonstrate the empirical advantages of the proposed method.
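The abstract combines three ingredients: an optimistic/extragradient-style update for the saddle-point operator, SVRG-style variance reduction exploiting the finite-sum structure, and mini-batch client sampling. The sketch below is a minimal illustration of how those ingredients fit together on a synthetic quadratic saddle problem; it is NOT the authors' SVOGS algorithm (which additionally uses gradient sliding), and all problem data, step sizes, and batch sizes are assumptions chosen only for the example.

```python
# Illustrative sketch only: variance-reduced extragradient with mini-batch
# "client" sampling on a finite-sum saddle problem
#   min_x max_y (1/n) sum_i [ 0.5 x'A_i x + x'B_i y - 0.5 y'C_i y ].
# This is not the paper's SVOGS method; it only shows the general technique.
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 5                                              # clients, dimension (assumed)
A = [np.eye(d) * (1 + rng.random()) for _ in range(n)]    # SPD -> strongly convex in x
B = [rng.standard_normal((d, d)) for _ in range(n)]
C = [np.eye(d) * (1 + rng.random()) for _ in range(n)]    # SPD -> strongly concave in y

def client_operator(i, x, y):
    """Monotone operator of client i: (grad_x f_i, -grad_y f_i)."""
    gx = A[i] @ x + B[i] @ y
    gy = C[i] @ y - B[i].T @ x
    return np.concatenate([gx, gy])

def full_operator(x, y):
    """Exact averaged operator over all clients (one full communication round)."""
    return sum(client_operator(i, x, y) for i in range(n)) / n

def vr_operator(x, y, ref, ref_full, batch):
    """SVRG-style estimator: full operator at a reference point plus a
    mini-batch correction evaluated at the current point."""
    corr = sum(client_operator(i, x, y) - client_operator(i, *ref) for i in batch) / len(batch)
    return ref_full + corr

eta, batch_size, T = 0.02, 4, 500                         # assumed hyperparameters
z = np.ones(2 * d)                                        # start away from the saddle at 0
ref = (z[:d].copy(), z[d:].copy())
ref_full = full_operator(*ref)

for t in range(T):
    x, y = z[:d], z[d:]
    batch = rng.choice(n, size=batch_size, replace=False)  # mini-batch client sampling
    # extragradient step driven by the variance-reduced operator estimate
    g_half = vr_operator(x, y, ref, ref_full, batch)
    z_half = z - eta * g_half
    g = vr_operator(z_half[:d], z_half[d:], ref, ref_full, batch)
    z = z - eta * g
    if (t + 1) % n == 0:                                   # refresh the reference point occasionally
        ref = (z[:d].copy(), z[d:].copy())
        ref_full = full_operator(*ref)

print("operator norm at final iterate:", np.linalg.norm(full_operator(z[:d], z[d:])))
```

In this toy setup the saddle point is at the origin, so the printed operator norm shrinking toward zero indicates convergence; the expensive full-operator evaluation happens only at occasional reference refreshes, which mirrors why such schemes save communication relative to evaluating all clients every round.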
