

Poster

Towards Optimal Communication Complexity in Distributed Non-Convex Optimization

Kumar Kshitij Patel · Lingxiao Wang · Blake Woodworth · Brian Bullins · Nati Srebro

Hall J (level 1) #936

Keywords: [ Lower Bounds ] [ Intermittent Communication Setting ] [ Federated Learning ] [ Non-Convex Optimization ] [ Stochastic Optimization ] [ Distributed Optimization ]


Abstract: We study the problem of distributed stochastic non-convex optimization with intermittent communication. We consider the full participation setting, where $M$ machines work in parallel over $R$ communication rounds, and the partial participation setting, where $M$ machines are sampled independently every round from some meta-distribution over machines. We propose and analyze a new algorithm that improves over existing methods by requiring fewer and lighter variance-reduction operations. We also present lower bounds, showing that our algorithm is either $\textit{optimal}$ or $\textit{almost optimal}$ in most settings. Numerical experiments demonstrate the superior performance of our algorithm.
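To make the intermittent communication setting concrete, the sketch below simulates $M$ machines that each take several local stochastic gradient steps between each of $R$ communication rounds, with machines sampled afresh every round (partial participation). This is a minimal illustration using plain Local SGD on a synthetic non-convex objective, not the paper's variance-reduced algorithm; the objective, step size, local-step count, and the uniform sampling meta-distribution are all illustrative assumptions.

```python
# Minimal sketch of the intermittent communication setting with partial
# participation, using plain Local SGD (NOT the paper's algorithm).
import numpy as np

rng = np.random.default_rng(0)

d, M_total, M, R, K, lr = 10, 20, 5, 50, 10, 0.05  # illustrative constants

# Each machine holds its own data, giving heterogeneous local objectives.
machine_data = [rng.normal(size=(100, d)) for _ in range(M_total)]
machine_targets = [np.tanh(A @ rng.normal(size=d)) for A in machine_data]

def stochastic_grad(w, A, y, batch=10):
    """Stochastic gradient of a non-convex loss: fit tanh(A w) to targets y."""
    idx = rng.choice(len(A), size=batch, replace=False)
    pred = np.tanh(A[idx] @ w)
    residual = pred - y[idx]
    # The tanh nonlinearity makes the least-squares fit non-convex in w.
    return A[idx].T @ (residual * (1.0 - pred**2)) / batch

w = np.zeros(d)
for r in range(R):
    # Partial participation: sample M machines i.i.d. from the
    # meta-distribution (here, uniform over the available machines).
    active = rng.choice(M_total, size=M, replace=True)
    local_iterates = []
    for m in active:
        w_m = w.copy()
        for _ in range(K):  # K local steps with no communication
            w_m -= lr * stochastic_grad(w_m, machine_data[m], machine_targets[m])
        local_iterates.append(w_m)
    # One communication round: average the local iterates.
    w = np.mean(local_iterates, axis=0)

grad_norms = [np.linalg.norm(stochastic_grad(w, A, y, batch=100))
              for A, y in zip(machine_data, machine_targets)]
print("average gradient norm after training:", np.mean(grad_norms))
```

The only quantity communicated per round is the averaged iterate, so the total communication scales with $R$ rather than the total number of local gradient computations, which is the regime the abstract's upper and lower bounds address.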
