Poster
Near-Optimal Distributed Minimax Optimization under the Second-Order Similarity
Qihao Zhou · Haishan Ye · Luo Luo
West Ballroom A-D #6110
Abstract:
This paper considers distributed convex-concave minimax optimization under the second-order similarity. We propose the stochastic variance-reduced optimistic gradient sliding (SVOGS) method, which takes advantage of the finite-sum structure in the objective by combining mini-batch client sampling with variance reduction. We prove that SVOGS achieves an $\varepsilon$-duality gap within $\mathcal{O}(\delta D^2/\varepsilon)$ communication rounds, $\mathcal{O}(n+\sqrt{n}\,\delta D^2/\varepsilon)$ communication complexity, and $\tilde{\mathcal{O}}(n+(\sqrt{n}\,\delta+L)D^2/\varepsilon)$ local gradient calls, where $n$ is the number of nodes, $\delta$ is the degree of the second-order similarity, $L$ is the smoothness parameter, and $D$ is the diameter of the constraint set. We verify that all of the above complexities (nearly) match the corresponding lower bounds. For the specific $\mu$-strongly-convex-$\mu$-strongly-concave case, our algorithm attains upper bounds on communication rounds, communication complexity, and local gradient calls of $\mathcal{O}(\delta/\mu\,\log(1/\varepsilon))$, $\mathcal{O}((n+\sqrt{n}\,\delta/\mu)\log(1/\varepsilon))$, and $\tilde{\mathcal{O}}((n+(\sqrt{n}\,\delta+L)/\mu)\log(1/\varepsilon))$, respectively, which are also nearly tight. Furthermore, we conduct numerical experiments to show the empirical advantages of the proposed method.
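To give a concrete feel for the ingredients named in the abstract, the following is a minimal, hypothetical Python sketch of a variance-reduced extragradient-style update for a finite-sum saddle-point problem with mini-batch client sampling. It is not the paper's SVOGS algorithm as specified (SVOGS additionally uses gradient sliding and particular parameter choices); the operators, step sizes, batch size, and inner-loop length below are illustrative assumptions only.

# A minimal, illustrative sketch (NOT the paper's SVOGS as specified): a
# variance-reduced extragradient-style update for a finite-sum saddle-point
# problem min_x max_y (1/n) * sum_i f_i(x, y), using mini-batch client sampling.
# The operator F(z) stacks (grad_x f, -grad_y f); all names and constants are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 32, 5                         # number of clients, dimension of z = (x, y)
A = rng.standard_normal((n, d, d))   # per-client data defining simple affine operators

def F_i(i, z):
    """Saddle-point operator of client i (an affine monotone operator, for illustration)."""
    M = A[i] - A[i].T + np.eye(d)    # skew-symmetric part + identity => monotone
    return M @ z

def F_full(z):
    """Exact averaged operator (1/n) * sum_i F_i(z); evaluated only at infrequent anchor points."""
    return np.mean([F_i(i, z) for i in range(n)], axis=0)

z = rng.standard_normal(d)           # current iterate
eta, batch = 0.05, 6                 # assumed step size and mini-batch size

for t in range(200):
    w = z.copy()                     # anchor point: one round touching all clients
    g_anchor = F_full(w)
    for k in range(5):               # cheap inner steps using sampled clients only
        S = rng.choice(n, size=batch, replace=False)
        # variance-reduced estimate: anchor gradient plus a sampled correction
        v = g_anchor + np.mean([F_i(i, z) - F_i(i, w) for i in S], axis=0)
        z_half = z - eta * v         # extrapolation (look-ahead) half step
        v_half = g_anchor + np.mean([F_i(i, z_half) - F_i(i, w) for i in S], axis=0)
        z = z - eta * v_half         # full step using the look-ahead operator estimate

print("||F(z)|| after the run:", np.linalg.norm(F_full(z)))

In this sketch, the full operator is evaluated only at anchor points while inner steps query a small random subset of clients, which is the general mechanism by which mini-batch client sampling and variance reduction cut communication and local gradient costs.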