We focus on dropout techniques for asynchronous distributed computation in federated learning (FL) scenarios. We propose \texttt{AsyncDrop}, a novel asynchronous FL framework with smart (i.e., informed/structured) dropout that achieves better performance than state-of-the-art asynchronous methods, at lower communication and training-time costs. The key idea is to construct sub-models out of the global model in a way that accounts for device heterogeneity. We conjecture that such an approach can be theoretically justified. We implement our approach and compare it against other asynchronous baselines, obtained by adapting current synchronous FL algorithms to asynchronous scenarios. Empirically, \texttt{AsyncDrop} significantly reduces communication cost and training time, while improving final test accuracy in non-i.i.d. scenarios.
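To make the sub-model idea concrete, here is a minimal sketch, not the authors' implementation: the two-layer MLP, the `extract_submodel` helper, and the capacity-based `keep_fraction` are illustrative assumptions. It shows structured dropout in the sense of keeping whole hidden neurons (entire rows/columns of the weight matrices), so a low-capacity device receives a genuinely smaller model rather than a sparsified one.

```python
import numpy as np

def extract_submodel(weights, keep_fraction, rng):
    """Slice a two-layer MLP down to a sub-model by keeping a structured
    subset of hidden neurons (whole rows/columns, not individual weights).
    keep_fraction is a stand-in for the device's compute/communication capacity."""
    W1, W2 = weights          # W1: (hidden, in), W2: (out, hidden)
    hidden = W1.shape[0]
    k = max(1, int(round(keep_fraction * hidden)))
    kept = rng.choice(hidden, size=k, replace=False)
    kept.sort()
    # Keep the selected hidden units' incoming and outgoing weights.
    return kept, (W1[kept, :], W2[:, kept])

# Hypothetical global model: 784 -> 128 -> 10.
rng = np.random.default_rng(0)
global_weights = (rng.standard_normal((128, 784)),
                  rng.standard_normal((10, 128)))

# A weak device gets a quarter of the hidden units.
kept, (w1, w2) = extract_submodel(global_weights, 0.25, rng)
print(w1.shape, w2.shape)  # (32, 784) (10, 32)
```

A server could call this per device with a device-specific `keep_fraction`, then scatter the resulting sub-models asynchronously; returned updates would be mapped back to the global model via the `kept` indices.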
Author Information
Chen Dun (Rice University)
Mirian Hipolito Garcia (Research, Microsoft)
Mirian Hipólito earned a Bachelor's degree in Digital Systems and Robotics Engineering from Tecnológico de Monterrey in 2020. She previously worked in the automotive sector, gaining experience in autonomous vehicles, robotics, and automation. She currently works as a Research Software Engineer at Microsoft Research (MSR) as part of the Privacy in Artificial Intelligence team.
Dimitrios Dimitriadis (Microsoft Research)
Christopher Jermaine (William Marsh Rice University)
Anastasios Kyrillidis (Rice University)
More from the Same Authors
- 2021 : Acceleration and Stability of the Stochastic Proximal Point Algorithm » Junhyung Lyle Kim · Panos Toulis · Anastasios Kyrillidis
- 2022 : LOFT: Finding Lottery Tickets through Filter-wise Training » Qihan Wang · Chen Dun · Fangshuo Liao · Christopher Jermaine · Anastasios Kyrillidis
- 2022 : Strong Lottery Ticket Hypothesis with $\epsilon$–perturbation » Fangshuo Liao · Zheyang Xiong · Anastasios Kyrillidis
- 2022 : FLUTE: A Scalable, Extensible Framework for High-Performance Federated Learning Simulations » Mirian Hipolito Garcia · Andre Manoel · Daniel Madrigal · Robert Sim · Dimitrios Dimitriadis
- 2022 : GIST: Distributed Training for Large-Scale Graph Convolutional Networks » Cameron Wolfe · Jingkang Yang · Fangshuo Liao · Arindam Chowdhury · Chen Dun · Artun Bayer · Santiago Segarra · Anastasios Kyrillidis
- 2019 : Final remarks » Anastasios Kyrillidis · Albert Berahas · Fred Roosta · Michael Mahoney
- 2019 Workshop: Beyond first order methods in machine learning systems » Anastasios Kyrillidis · Albert Berahas · Fred Roosta · Michael Mahoney
- 2019 : Opening Remarks » Anastasios Kyrillidis · Albert Berahas · Fred Roosta · Michael Mahoney
- 2019 Poster: Learning Sparse Distributions using Iterative Hard Thresholding » Jacky Zhang · Rajiv Khanna · Anastasios Kyrillidis · Sanmi Koyejo