

Poster in Workshop: New Frontiers in Federated Learning: Privacy, Fairness, Robustness, Personalization and Data Ownership

Efficient and Private Federated Learning with Partially Trainable Networks

Hakim Sidahmed · Zheng Xu · Yuan Cao


Abstract:

Federated learning is used for decentralized training of machine learning models on a large number (millions) of edge mobile devices. It is challenging because mobile devices often have limited communication bandwidth and local computation resources, so improving the efficiency of federated learning is critical for scalability and usability. In this paper, we propose to leverage partially trainable neural networks, which freeze a portion of the model parameters during the entire training process, to reduce the communication cost with little impact on model performance. Through extensive experiments, we empirically show that Federated learning of Partially Trainable neural networks (FedPT) can result in good communication-accuracy trade-offs, with up to a 46x reduction in communication cost at a small accuracy cost. Our approach also enables faster training, a smaller memory footprint, and better resilience under strong privacy guarantees. The proposed FedPT can be particularly interesting for pushing the limits of overparameterization in on-device learning.
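The core mechanism lends itself to a compact sketch: freeze a randomly chosen slice of the weights at initialization, let clients run local SGD on the remaining trainable slice, and communicate only that slice each round. The following is a minimal illustrative sketch, assuming a toy linear least-squares model in numpy and FedAvg-style delta averaging; the mask construction, learning rate, and synthetic client data are hypothetical choices for illustration, not the paper's actual implementation.

```python
# Minimal FedPT-style sketch: freeze a fraction of parameters for the whole
# run and communicate only the trainable slice. All specifics (mask, model,
# hyperparameters) are illustrative assumptions, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)

DIM = 10
FROZEN_FRAC = 0.7  # fraction of parameters frozen for the entire training run

# Global model: frozen parameters stay at their random initialization and
# never cross the network after the initial broadcast.
params = rng.normal(size=DIM)
trainable_mask = rng.random(DIM) >= FROZEN_FRAC  # True = trainable

def client_update(global_trainable, data, lr=0.1, steps=5):
    """Run local SGD on the trainable slice only; the frozen slice stays put."""
    w = global_trainable.copy()
    X, y = data
    for _ in range(steps):
        full = params.copy()
        full[trainable_mask] = w
        grad = 2 * X.T @ (X @ full - y) / len(y)  # least-squares gradient
        w -= lr * grad[trainable_mask]            # update trainable part only
    return w - global_trainable                   # upload only this delta

# One federated round over a few synthetic clients.
clients = [(rng.normal(size=(20, DIM)), rng.normal(size=20)) for _ in range(4)]
deltas = [client_update(params[trainable_mask], d) for d in clients]
params[trainable_mask] += np.mean(deltas, axis=0)  # server averages the deltas

# Only the trainable slice is communicated each round.
print(f"communicated floats per client: {trainable_mask.sum()} of {DIM}")
```

With roughly 70% of the parameters frozen, each client uploads about a third of the floats a fully trainable model would; shrinking this trainable slice is the source of the communication savings the abstract quantifies (up to 46x in the paper's experiments).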
