To Federate or Not To Federate: Incentivizing Client Participation in Federated Learning
Yae Jee Cho · Divyansh Jhunjhunwala · Tian Li · Virginia Smith · Gauri Joshi

Fri Dec 02 08:50 AM -- 08:57 AM (PST)
Event URL: https://openreview.net/forum?id=pG08eM0CQba
Federated learning (FL) facilitates collaboration between a group of clients who seek to train a common machine learning model without directly sharing their local data. Although there is an abundance of research on improving the speed, efficiency, and accuracy of federated training, most works implicitly assume that all clients are willing to participate in the FL framework. Due to data heterogeneity, however, the global model may not work well for some clients, and they may instead choose to use their own local model. Such disincentivization of clients can be problematic from the server's perspective because having more participating clients yields a better global model, and offers better privacy guarantees to the participating clients. In this paper, we propose an algorithm called IncFL that explicitly maximizes the fraction of clients who are incentivized to use the global model by dynamically adjusting the aggregation weights assigned to their updates. Our experiments show that IncFL increases the number of incentivized clients by $30$-$55\%$ compared to standard federated training algorithms, and can also improve the generalization performance of the global model on unseen clients.
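The core idea above, upweighting the updates of clients who are not yet incentivized (i.e., whose local model still beats the global model on their own data), can be sketched roughly as follows. This is a hypothetical illustration under simple assumptions, not the authors' exact IncFL algorithm; the function name, the `step` parameter, and the specific reweighting rule are all invented for illustration.

```python
import numpy as np

def aggregate_incentive_aware(client_updates, global_losses, local_losses,
                              base_weights, step=0.1):
    """Hypothetical incentive-aware aggregation sketch (not the paper's exact
    IncFL method). A client counts as "incentivized" when the global model's
    loss on its data is at most its own local model's loss. Clients that are
    NOT yet incentivized receive extra aggregation weight, nudging the global
    model toward serving them."""
    incentivized = global_losses <= local_losses           # boolean, one entry per client
    # Upweight the clients that are not yet incentivized; `step` controls how
    # aggressively their influence on the global update grows.
    weights = base_weights * (1.0 + step * (~incentivized))
    weights = weights / weights.sum()                      # renormalize to sum to 1
    # Weighted average of the client updates (one row per client).
    return weights @ client_updates
```

For example, with two equally weighted clients where only the second is disincentivized, the second client's update receives slightly more than half the total aggregation weight on the next round.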

Author Information

Yae Jee Cho (Carnegie Mellon University)
Divyansh Jhunjhunwala (Carnegie Mellon University)
Tian Li (Carnegie Mellon University)
Virginia Smith (Carnegie Mellon University)
Gauri Joshi (Carnegie Mellon University)