Efficient Federated Random Subnetwork Training
Francesco Pase · Berivan Isik · Deniz Gunduz · Tsachy Weissman · Michele Zorzi

Fri Dec 02 08:30 AM -- 08:37 AM (PST)
Event URL: https://openreview.net/forum?id=YZIVv_37y2z
A main challenge in federated learning is the large communication cost of exchanging weight updates between clients and the server at each round. While prior work has made great progress in compressing the weight updates through gradient compression methods, we propose a radically different approach that does not update the weights at all. Instead, our method freezes the weights at their initial random values and learns how to sparsify the random network for the best performance. To this end, the clients collaborate in training a \emph{stochastic} binary mask to find the optimal sparse random network within the original one. At the end of training, the final model is a randomly weighted sparse network, i.e., a subnetwork inside the random dense network. We show improvements in accuracy, communication bitrate (less than $1$ bit per parameter (bpp)), convergence speed, and final model size (less than $1$ bpp) over relevant baselines on the MNIST, EMNIST, CIFAR-10, and CIFAR-100 datasets in the low-bitrate regime, under various system configurations.
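
The core idea, training a stochastic binary mask over frozen random weights, can be illustrated with a short sketch. The following is a minimal PyTorch-style illustration, not the authors' implementation: the layer name, the initialization scale, and the straight-through gradient trick are assumptions made for exposition.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedLinear(nn.Module):
    """Linear layer whose weights stay frozen at their random initialization;
    only per-weight Bernoulli mask logits are trained (hypothetical sketch)."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # Frozen random weights: registered as a buffer so they are never updated.
        w = torch.randn(out_features, in_features) / in_features ** 0.5
        self.register_buffer("weight", w)
        # Learnable logits parameterizing per-weight keep probabilities; these
        # are the only quantities clients would train and communicate.
        self.logits = nn.Parameter(torch.zeros(out_features, in_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        probs = torch.sigmoid(self.logits)
        # Sample a binary mask; the straight-through estimator below lets
        # gradients flow to the logits despite the non-differentiable sampling.
        mask = torch.bernoulli(probs)
        mask = mask + probs - probs.detach()
        return F.linear(x, self.weight * mask)
```

After training the logits, one plausible way to obtain the final model is to harden the mask (for example, `mask = (probs > 0.5).float()`) so that each parameter is described by a single bit, which is consistent with the sub-$1$ bpp figures quoted in the abstract; the exact hardening and encoding rules here are assumptions, not taken from the paper.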

Author Information

Francesco Pase (University of Padova)
Berivan Isik (Stanford University)

I am a fourth-year PhD student in the Electrical Engineering Department at Stanford University, advised by Tsachy Weissman. My research interests are machine learning, information theory, and data compression. Recently, I have been working on model compression, federated learning, learned data compression, and compression for privacy, robustness, and fairness in machine learning. My research is supported by the Stanford Graduate Fellowship (2019-2023). I received my MS degree from Stanford University in June 2021 and my BS degree from Middle East Technical University in June 2019, both in Electrical Engineering. Previously, I interned at Stanford in the summer of 2018 as an undergraduate researcher under the supervision of Ayfer Ozgur. In the summer of 2021, I worked at Google as a research intern hosted by Philip Chou. In February 2022, I returned to Google as a student researcher and worked on learned video compression until October 2022. I have been working at Amazon as an applied scientist intern since October 2022.

Deniz Gunduz (Imperial College London)
Tsachy Weissman (Stanford University)
Michele Zorzi (University of Padua)
