Towards Optimal Communication Complexity in Distributed Non-Convex Optimization

Kumar Kshitij Patel · Lingxiao Wang · Blake Woodworth · Brian Bullins · Nati Srebro

Hall J #936

Keywords: [ lower bounds ] [ Intermittent Communication Setting ] [ federated learning ] [ Non-Convex Optimization ] [ Stochastic Optimization ] [ Distributed Optimization ]

[ Paper ] [ Poster ] [ OpenReview ]
Thu 1 Dec 9 a.m. PST — 11 a.m. PST

Abstract: We study the problem of distributed stochastic non-convex optimization with intermittent communication. We consider the full participation setting, where $M$ machines work in parallel over $R$ communication rounds, and the partial participation setting, where $M$ machines are sampled independently every round from some meta-distribution over machines. We propose and analyze a new algorithm that improves on existing methods by requiring fewer and lighter variance reduction operations. We also present lower bounds, showing our algorithm is either $\textit{optimal}$ or $\textit{almost optimal}$ in most settings. Numerical experiments demonstrate the superior performance of our algorithm.
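To make the intermittent communication setting concrete, the following is a minimal simulation sketch of a baseline method in this setting (local SGD with periodic averaging): $M$ machines each take $K$ local stochastic gradient steps per round, then average their iterates at each of the $R$ communication rounds. All specifics here (the toy quadratic objective, noise level, step size, and the `local_sgd` helper) are illustrative assumptions, not the paper's algorithm or its variance-reduced updates.

```python
import numpy as np

def local_sgd(M=4, R=20, K=10, lr=0.1, dim=5, seed=0):
    """Simulate M machines over R communication rounds, each taking K local
    stochastic gradient steps between rounds, then averaging iterates.
    The objective is a hypothetical quadratic f(x) = 0.5 * ||x - x_star||^2
    with additive gradient noise -- a stand-in, not the paper's objective."""
    rng = np.random.default_rng(seed)
    x_star = np.ones(dim)            # assumed minimizer of the toy objective
    x = np.zeros(dim)                # shared starting point
    for _ in range(R):
        # each machine begins the round from the last averaged iterate
        local = np.tile(x, (M, 1))
        for _ in range(K):
            # stochastic gradient of the quadratic: (x - x_star) plus noise
            grads = (local - x_star) + 0.1 * rng.standard_normal((M, dim))
            local -= lr * grads
        # one communication round: average the M local iterates
        x = local.mean(axis=0)
    return x
```

In this toy run the averaged iterate converges toward `x_star`; the paper's contribution is an algorithm with stronger guarantees in this setting for non-convex objectives, using fewer and cheaper variance reduction steps than prior work.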
