

Poster in Workshop: Workshop on Federated Learning in the Age of Foundation Models in Conjunction with NeurIPS 2023 (FL@FM-NeurIPS'23)

Fed3R: Recursive Ridge Regression for Federated Learning with strong pre-trained models

Eros Fanì · Raffaello Camoriano · Barbara Caputo · Marco Ciccone

Keywords: [ pre-trained models ] [ ridge regression ] [ random features ] [ destructive interference ] [ federated learning ] [ statistical heterogeneity ] [ client drift ]


Abstract:

Federated Learning offers a powerful solution for training models on data that cannot be centrally stored due to privacy concerns. However, the existing paradigm suffers from high statistical heterogeneity across clients' data, which causes client drift due to biased local solutions. This issue is particularly pronounced in the final classifier layer, severely impeding convergence speed during aggregation. To overcome these challenges, we introduce Federated Recursive Ridge Regression (Fed3R). This approach replaces the gradient-based classifier with a ridge regression classifier computed in closed form, guaranteeing resilience to client drift and drastically reducing convergence time and communication costs. The incremental formulation of Fed3R is exactly equivalent to the centralized ridge regression solution, enabling the use of more complex architectures with pre-trained parameters and robust generalization capabilities that were incompatible with previous federated learning techniques. We propose three variants of Fed3R, with Fed3R-RF significantly enhancing performance to levels comparable to centralized training while remaining competitive in total communication costs.
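The key property claimed in the abstract, that aggregating per-client closed-form statistics recovers the exact centralized ridge regression solution, can be illustrated with a short sketch. The function names and shapes below are hypothetical illustrations (the paper's actual Fed3R variants are not specified here); the sketch only assumes each client holds fixed feature embeddings (e.g., from a frozen pre-trained backbone) and one-hot labels:

```python
import numpy as np

def client_statistics(features, labels):
    # Each client k computes sufficient statistics on its local data:
    # A_k = Phi_k^T Phi_k and b_k = Phi_k^T Y_k, where Phi_k are fixed
    # feature embeddings and Y_k are one-hot label matrices.
    return features.T @ features, features.T @ labels

def server_aggregate(stats, lam=1.0):
    # Summing the per-client statistics and solving the regularized
    # normal equations (A + lam*I) W = b yields the same classifier as
    # ridge regression on the pooled data, regardless of how the data
    # is split across clients -- hence no client drift.
    A = sum(a for a, _ in stats)
    b = sum(b for _, b in stats)
    d = A.shape[0]
    return np.linalg.solve(A + lam * np.eye(d), b)

# Toy check: two clients with different local data vs. pooled data.
rng = np.random.default_rng(0)
phi1, y1 = rng.normal(size=(20, 5)), rng.normal(size=(20, 3))
phi2, y2 = rng.normal(size=(30, 5)), rng.normal(size=(30, 3))
w_fed = server_aggregate([client_statistics(phi1, y1),
                          client_statistics(phi2, y2)])
phi, y = np.vstack([phi1, phi2]), np.vstack([y1, y2])
w_central = np.linalg.solve(phi.T @ phi + np.eye(5), phi.T @ y)
assert np.allclose(w_fed, w_central)
```

Because only the d×d matrix A_k and the d×c matrix b_k are communicated (not raw data or gradients), communication cost is independent of local dataset size and of the number of rounds, which is consistent with the abstract's claim of reduced communication costs.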
