In this talk, we consider the problem of training a machine learning model with distributed differential privacy (DP), where secure aggregation (SecAgg) is used to ensure that the server only sees the noisy sum of model updates in every training round. Taking into account the linearity constraints imposed by SecAgg, we characterize the optimal communication cost required to obtain the best accuracy achievable under central DP (i.e., under a fully trusted server and no communication constraints), and we derive a simple and efficient scheme that achieves this optimal cost. We evaluate the scheme on real-world federated learning tasks and show that the communication cost can be reduced to under 1.78 bits per parameter in realistic privacy settings without degrading test-time performance. We conclude the talk with a few important and non-trivial open research directions.
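Since the abstract only sketches the setup, the following is a minimal, hypothetical Python sketch of the general pattern it describes, not the talk's actual scheme: each client quantizes its update, adds local discrete noise (Skellam noise, as in the related poster below) whose sum across clients supplies the DP guarantee, and applies an additive mask that cancels in the modular sum, so the server only ever sees the noisy aggregate. All constants (M, SCALE, MU) and helper names are illustrative assumptions.

```python
import numpy as np

M = 2 ** 16          # modulus of the SecAgg finite field (hypothetical)
SCALE = 100          # fixed-point quantization scale (hypothetical)
MU = 8.0             # per-client Skellam noise parameter (hypothetical)
rng = np.random.default_rng(0)

def skellam(mu, size):
    # Skellam noise is the difference of two Poisson variables; the sum of
    # independent Skellams is again Skellam, which makes the noise compose
    # cleanly under SecAgg's summation.
    return rng.poisson(mu, size) - rng.poisson(mu, size)

def client_message(update, mask):
    # Quantize to integers, add local DP noise, then mask; everything is
    # reduced mod M to respect the linearity constraints of SecAgg.
    q = np.round(update * SCALE).astype(np.int64)
    return (q + skellam(MU, q.shape) + mask) % M

d = 4
updates = [rng.normal(0, 0.1, d) for _ in range(3)]

# Pairwise-canceling masks stand in for the full SecAgg protocol:
# the three masks sum to zero mod M, so they vanish in the aggregate.
masks = [rng.integers(0, M, d) for _ in range(2)]
masks.append((-(masks[0] + masks[1])) % M)

agg = sum(client_message(u, m) for u, m in zip(updates, masks)) % M
agg = np.where(agg >= M // 2, agg - M, agg)  # map back to signed range
noisy_mean = agg / (SCALE * len(updates))
print(noisy_mean)  # server-side view: only the noisy average survives
```

Note that the server never observes an individual client's message in the clear; only the modular sum, in which the masks cancel and the per-client noise terms add up to the central-DP noise level.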
Author Information
Peter Kairouz (Google)
More from the Same Authors
- 2021: Communication Efficient Federated Learning with Secure Aggregation and Differential Privacy
  Wei-Ning Chen · Christopher Choquette-Choo · Peter Kairouz
- 2021 Poster: Pointwise Bounds for Distribution Estimation under Communication Constraints
  Wei-Ning Chen · Peter Kairouz · Ayfer Ozgur
- 2021 Poster: The Skellam Mechanism for Differentially Private Federated Learning
  Naman Agarwal · Peter Kairouz · Ken Liu
- 2020 Tutorial: (Track 1) Federated Learning and Analytics: Industry Meets Academia Q&A
  Peter Kairouz · Brendan McMahan · Virginia Smith