Poster
A Unified Analysis of Federated Learning with Arbitrary Client Participation
Shiqiang Wang · Mingyue Ji

Wed Nov 30 02:00 PM -- 04:00 PM (PST) @ Hall J #307

Federated learning (FL) faces challenges of intermittent client availability and computation/communication efficiency. As a result, only a small subset of clients can participate in FL at a given time. It is important to understand how partial client participation affects convergence, but most existing works have either considered idealized participation patterns or obtained results with non-zero optimality error for generic patterns. In this paper, we provide a unified convergence analysis for FL with arbitrary client participation. We first introduce a generalized version of federated averaging (FedAvg) that amplifies parameter updates at an interval of multiple FL rounds. Then, we present a novel analysis that captures the effect of client participation in a single term. By analyzing this term, we obtain convergence upper bounds for a wide range of participation patterns, including both non-stochastic and stochastic cases, which match either the lower bound of stochastic gradient descent (SGD) or the state-of-the-art results in specific settings. We also discuss various insights, recommendations, and experimental results.
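To make the described mechanism concrete, below is a minimal sketch (not the authors' code) of FedAvg with partial client participation and periodic update amplification, on a toy quadratic objective where client i minimizes ||x - target_i||^2. The amplification factor `eta_amp`, the interval `P`, the learning rate, and the participation scheme are all illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def local_update(x, target, lr=0.1, steps=5):
    """Local gradient steps on the client objective ||x - target||^2 (toy model)."""
    for _ in range(steps):
        x = x - lr * 2.0 * (x - target)
    return x

def generalized_fedavg(targets, rounds=100, clients_per_round=3,
                       P=4, eta_amp=2.0, seed=0):
    """FedAvg with partial participation; every P rounds, the update
    accumulated since the last anchor is amplified by eta_amp.
    P and eta_amp are illustrative, not values from the paper."""
    rng = np.random.default_rng(seed)
    num_clients, d = targets.shape
    x = np.zeros(d)
    anchor = x.copy()
    for r in range(rounds):
        # Partial participation: only a small random subset of clients
        # contributes in each round (here sampled uniformly, for illustration).
        participants = rng.choice(num_clients, size=clients_per_round,
                                  replace=False)
        deltas = [local_update(x, targets[i]) - x for i in participants]
        x = x + np.mean(deltas, axis=0)          # standard FedAvg aggregation
        if (r + 1) % P == 0:
            x = anchor + eta_amp * (x - anchor)  # amplified multi-round update
            anchor = x.copy()
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    targets = rng.normal(size=(10, 5))  # 10 clients, 5-dimensional model
    x = generalized_fedavg(targets)
    # The global optimum of the average objective is the mean of the targets.
    print(np.linalg.norm(x - targets.mean(axis=0)))
```

With full participation and `eta_amp=1.0` this reduces to plain FedAvg and converges to the mean of the client optima; the amplified variant illustrates how multi-round updates can be scaled, which is the knob the paper's analysis studies under arbitrary participation patterns.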

Author Information

Shiqiang Wang (IBM Research)
Mingyue Ji (University of Utah)
