Federated Learning (FL) has been gaining significant traction across different ML tasks, ranging from vision to keyboard prediction. In large-scale deployments, client heterogeneity is a given and constitutes a primary problem for fairness, training performance and accuracy. Although significant efforts have been devoted to tackling statistical data heterogeneity, the diversity in the processing capabilities and network bandwidth of clients, termed system heterogeneity, has remained largely unexplored. Current solutions either disregard a large portion of available devices or set a uniform limit on the model's capacity, restricted by the least capable participants. In this work, we introduce Ordered Dropout, a mechanism that achieves an ordered, nested representation of knowledge in neural networks and enables the extraction of lower-footprint submodels without the need for retraining. We further show that, for linear maps, Ordered Dropout is equivalent to SVD. We employ this technique, along with a self-distillation methodology, in the realm of FL in a framework called FjORD. FjORD alleviates the problem of client system heterogeneity by tailoring the model width to the client's capabilities. Extensive evaluation on both CNNs and RNNs across diverse modalities shows that FjORD consistently leads to significant performance gains over state-of-the-art baselines while maintaining its nested structure.
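To make the nested-submodel idea concrete, here is a minimal sketch of the prefix-slicing mechanism the abstract describes, written in PyTorch. This is a hypothetical illustration, not the authors' implementation: the class name `OrderedDropoutLinear`, the candidate width ratios, and the per-step sampling scheme are all assumptions.

```python
import math
import random

import torch
import torch.nn as nn
import torch.nn.functional as F

class OrderedDropoutLinear(nn.Module):
    """Linear layer that, given a width ratio p, uses only its first
    ceil(p * out_features) units, so every narrower submodel is a
    prefix of (i.e., nested inside) every wider one."""

    def __init__(self, in_features, out_features, p_values=(0.25, 0.5, 0.75, 1.0)):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.p_values = p_values  # candidate width ratios (assumed set)

    def forward(self, x, p=None):
        if p is None:
            # Sample a width ratio per forward pass during training.
            p = random.choice(self.p_values)
        k = math.ceil(p * self.linear.out_features)
        # Keep only the first k output units: slicing the weight rows
        # (and bias) extracts the width-p submodel without retraining.
        return F.linear(x, self.linear.weight[:k], self.linear.bias[:k])

# Example: the p = 0.5 submodel is extracted by plain slicing at inference.
layer = OrderedDropoutLinear(16, 8)
y_full = layer(torch.randn(4, 16), p=1.0)  # shape (4, 8), full model
y_half = layer(torch.randn(4, 16), p=0.5)  # shape (4, 4), nested in the full model
```

In a full network, each subsequent layer would also slice its input columns to match the previous layer's retained width; in the FL setting described above, each client would then train at the largest width ratio its hardware supports.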
Author Information
Samuel Horváth (King Abdullah University of Science and Technology)
Stefanos Laskaridis (Samsung AI Center Cambridge)
Mario Almeida (Samsung)
Ilias Leontiadis
Stylianos Venieris (Samsung AI)
Nicholas Lane (Samsung AI Center Cambridge & University of Oxford)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Poster: FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout »
  Thu. Dec 9th 04:30 -- 06:00 PM
More from the Same Authors
- 2020 : Optimal Client Sampling for Federated Learning »
  Samuel Horváth
- 2021 : A Channel Coding Benchmark for Meta-Learning »
  Rui Li · Ondrej Bohdal · Rajesh K Mishra · Hyeji Kim · Da Li · Nicholas Lane · Timothy Hospedales
- 2021 : FedMix: A Simple and Communication-Efficient Alternative to Local Methods in Federated Learning »
  Elnur Gasanov · Ahmed Khaled Ragab Bayoumi · Samuel Horváth · Peter Richtarik
- 2022 : Cross-device Federated Architecture Search »
  Stefanos Laskaridis · Javier Fernandez-Marques · Łukasz Dudziak
- 2020 : Contributed talks in Session 2 (Zoom) »
  Martin Takac · Samuel Horváth · Guan-Horng Liu · Nicolas Loizou · Sharan Vaswani
- 2020 : Contributed Video: Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization, Samuel Horvath »
  Samuel Horváth
- 2020 : Poster Session 1 (gather.town) »
  Laurent Condat · Tiffany Vlaar · Ohad Shamir · Mohammadi Zaki · Zhize Li · Guan-Horng Liu · Samuel Horváth · Mher Safaryan · Yoni Choukroun · Kumar Shridhar · Nabil Kahale · Jikai Jin · Pratik Kumar Jawanpuria · Gaurav Kumar Yadav · Kazuki Koyama · Junyoung Kim · Xiao Li · Saugata Purkayastha · Adil Salim · Dighanchal Banerjee · Peter Richtarik · Lakshman Mahto · Tian Ye · Bamdev Mishra · Huikang Liu · Jiajie Zhu
- 2020 Poster: BRP-NAS: Prediction-based NAS using GCNs »
  Lukasz Dudziak · Thomas Chau · Mohamed Abdelfattah · Royson Lee · Hyeji Kim · Nicholas Lane
- 2020 Poster: Lower Bounds and Optimal Algorithms for Personalized Federated Learning »
  Filip Hanzely · Slavomír Hanzely · Samuel Horváth · Peter Richtarik