Oral in Workshop: Workshop on Federated Learning in the Age of Foundation Models in Conjunction with NeurIPS 2023 (FL@FM-NeurIPS'23)

FedSoL: Bridging Global Alignment and Local Generality in Federated Learning

Gihun Lee · Minchan Jeong · SangMook Kim · Jaehoon Oh · Se-Young Yun

Keywords: [ federated learning ] [ Sharpness-Aware Minimization ] [ proximal loss ]

Sat 16 Dec 8:55 a.m. PST — 9:05 a.m. PST

Abstract:

While federated learning (FL) enables learning a model while preserving data privacy, it often suffers from significant performance degradation when client data distributions are heterogeneous. Many previous FL algorithms address this issue by introducing various proximal restrictions, which encourage global alignment by constraining the deviation of local learning from the global objective. However, these restrictions inherently limit local learning by interfering with the original local objectives. Recently, an alternative approach has emerged to improve local learning generality: by obtaining local models within a smooth loss landscape, it mitigates conflicts among the clients' different local objectives. Yet it does not ensure stable global alignment, as local learning does not take the global objective into account. In this study, we propose Federated Stability on Learning (FedSoL), which combines the concepts of global alignment and local generality. In FedSoL, local learning seeks a parameter region that is robust against proximal perturbations. This strategy introduces an implicit proximal restriction effect into local learning while maintaining the original local objective for parameter updates.
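As a rough illustration of the mechanism described above, the sketch below perturbs the local weights along the gradient of a proximal term (SAM-style ascent step), then takes the descent step using the gradient of the unmodified local loss at that perturbed point. This is not the authors' released implementation: the function name fedsol_local_step, the perturbation radius rho, and the proximal weight mu are our own assumptions, and the actual FedSoL update may differ in detail.

import torch
import torch.nn.functional as F

def fedsol_local_step(model, global_model, batch, optimizer, rho=0.05, mu=1.0):
    """One illustrative FedSoL-style local update.

    1) Perturb weights along the gradient of a proximal loss
       (mu/2) * ||w - w_global||^2, with perturbation radius rho.
    2) Evaluate the *original* local objective at the perturbed
       weights and descend with that gradient from the original point.
    """
    x, y = batch

    # Step 1: gradient of the proximal term w.r.t. the local weights.
    prox_loss = 0.0
    for p, g in zip(model.parameters(), global_model.parameters()):
        prox_loss = prox_loss + (mu / 2) * (p - g.detach()).pow(2).sum()
    prox_grads = torch.autograd.grad(prox_loss, list(model.parameters()))

    # Normalize to a perturbation of radius rho (SAM-style ascent step).
    grad_norm = torch.sqrt(sum(g.pow(2).sum() for g in prox_grads)) + 1e-12
    eps = [rho * g / grad_norm for g in prox_grads]

    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            p.add_(e)  # move to the proximally perturbed point

    # Step 2: original local objective, evaluated at the perturbed weights.
    loss = F.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()

    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            p.sub_(e)  # undo the perturbation before the actual update

    optimizer.step()  # descend on the original weights
    return loss.item()

In a federated round, a client would call this per mini-batch during local training, with global_model frozen to the weights received from the server; the perturbation injects the proximal restriction implicitly, while the parameter update itself still follows the client's own objective.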
