In decentralized optimization, each node of a communication network holds a local objective function, and nodes communicate using gossip-based methods in order to minimize the average of these per-node functions. While synchronous algorithms are heavily impacted by a few slow nodes or edges in the graph (the \emph{straggler problem}), their asynchronous counterparts are notoriously harder to parametrize: their convergence properties for networks with heterogeneous communication and computation delays have defied analysis so far. In this paper, we use a \emph{continuized} framework to analyze asynchronous algorithms in networks with delays. Our approach yields a precise characterization of convergence time and of its dependency on heterogeneous delays in the network. The continuized framework benefits from the best of both the continuous and discrete worlds: the algorithms it applies to are based on event-driven updates, and are thus essentially discrete and readily implementable; yet their analysis proceeds in continuous time, relying in part on the theory of delayed ODEs. Our algorithms moreover achieve an \emph{asynchronous speedup}: their rate of convergence is controlled by the eigengap of the network graph weighted by local delays, instead of the network-wide worst-case delay as in previous analyses. Our methods thus enjoy improved robustness to stragglers.
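To make the event-driven viewpoint concrete, here is a minimal Python sketch of plain randomized gossip averaging seen through the continuized lens: each edge carries an independent Poisson clock, and the two endpoints average their values whenever their edge's clock rings. This illustrates only the discrete, event-driven updates analyzed in continuous time; it is not the paper's accelerated or delay-aware algorithm, and the graph, rates, and initial values are arbitrary choices for demonstration.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): randomized gossip
# averaging in the continuized / event-driven view. Each edge (i, j)
# carries an independent Poisson clock; when an edge's clock rings,
# its two endpoints average their current values.

rng = np.random.default_rng(0)

n = 8                                          # number of nodes (assumption)
edges = [(i, (i + 1) % n) for i in range(n)]   # ring graph (assumption)
rates = np.ones(len(edges))                    # Poisson intensity per edge (assumption)

x = rng.normal(size=n)    # local values; the goal is consensus on mean(x)
target = x.mean()

t, horizon = 0.0, 50.0
while True:
    # Next event time: the superposition of the edge Poisson processes
    # is itself Poisson with intensity rates.sum().
    t += rng.exponential(1.0 / rates.sum())
    if t > horizon:
        break
    # The ringing edge is chosen with probability proportional to its rate.
    e = rng.choice(len(edges), p=rates / rates.sum())
    i, j = edges[e]
    # Pairwise averaging update, applied only at event times.
    x[i] = x[j] = 0.5 * (x[i] + x[j])

print("max deviation from mean:", np.abs(x - target).max())
```

Under this dynamic, the expected squared distance to consensus contracts at a rate governed by the eigengap of the (rate-weighted) graph Laplacian, which is the quantity the delay-weighted analysis in the paper refines.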
Author Information
Mathieu Even (Inria)
Hadrien Hendrikx (EPFL)
Laurent Massoulié (Inria)
More from the Same Authors
- 2022 Poster: Asynchronous SGD Beats Minibatch SGD Under Arbitrary Delays
  Konstantin Mishchenko · Francis Bach · Mathieu Even · Blake Woodworth
- 2022 Poster: On Sample Optimality in Personalized Collaborative and Federated Learning
  Mathieu Even · Laurent Massoulié · Kevin Scaman
- 2022 Poster: Muffliato: Peer-to-Peer Privacy Amplification for Decentralized Optimization and Averaging
  Edwige Cyffers · Mathieu Even · Aurélien Bellet · Laurent Massoulié
- 2021 Oral: Continuized Accelerations of Deterministic and Stochastic Gradient Descents, and of Gossip Algorithms
  Mathieu Even · Raphaël Berthier · Francis Bach · Nicolas Flammarion · Hadrien Hendrikx · Pierre Gaillard · Laurent Massoulié · Adrien Taylor
- 2021 Poster: Continuized Accelerations of Deterministic and Stochastic Gradient Descents, and of Gossip Algorithms
  Mathieu Even · Raphaël Berthier · Francis Bach · Nicolas Flammarion · Hadrien Hendrikx · Pierre Gaillard · Laurent Massoulié · Adrien Taylor
- 2020 Poster: Dual-Free Stochastic Decentralized Optimization with Variance Reduction
  Hadrien Hendrikx · Francis Bach · Laurent Massoulié
- 2019 Poster: An Accelerated Decentralized Stochastic Proximal Algorithm for Finite Sums
  Hadrien Hendrikx · Francis Bach · Laurent Massoulié
- 2018 Poster: Optimal Algorithms for Non-Smooth Distributed Optimization in Networks
  Kevin Scaman · Francis Bach · Sebastien Bubeck · Laurent Massoulié · Yin Tat Lee
- 2018 Oral: Optimal Algorithms for Non-Smooth Distributed Optimization in Networks
  Kevin Scaman · Francis Bach · Sebastien Bubeck · Laurent Massoulié · Yin Tat Lee
- 2017: Dynamic Safe Interruptibility for Decentralized Multi-Agent Reinforcement Learning
  Hadrien Hendrikx
- 2017 Poster: Dynamic Safe Interruptibility for Decentralized Multi-Agent Reinforcement Learning
  El Mahdi El-Mhamdi · Rachid Guerraoui · Hadrien Hendrikx · Alexandre Maurer
- 2017 Spotlight: Dynamic Safe Interruptibility for Decentralized Multi-Agent Reinforcement Learning
  El Mahdi El-Mhamdi · Rachid Guerraoui · Hadrien Hendrikx · Alexandre Maurer