Oral
in
Workshop: Federated Learning: Recent Advances and New Challenges

Tackling Personalized Federated Learning with Label Concept Drift via Hierarchical Bayesian Modeling

Xingchen Ma · Junyi Zhu · Matthew Blaschko


Abstract:

Federated Learning (FL) is a distributed learning scheme for training a shared model across clients. A fundamental challenge in FL is that the data across clients may be non-identically distributed, which is common in practice. Personalized Federated Learning (PFL) attempts to address this challenge. Most methods in the PFL literature focus on data heterogeneity in which clients differ in their label distributions. In this work, we focus on label concept drift, a significant but largely unexplored setting. First, we present a general framework for PFL based on hierarchical Bayesian inference: a global variable is introduced to capture the common trends across clients and is used to augment the joint distribution of the clients' parameters. We then describe two concrete inference algorithms based on this framework. The first finds a maximum a posteriori (MAP) solution for the augmented posterior distribution and adds little overhead compared with existing approaches. The second further accounts for the uncertainty in clients' parameters and for differing drift patterns across clients. We demonstrate our methods through extensive empirical studies on CIFAR100 and SUN397. Experimental results show that our approach significantly outperforms state-of-the-art PFL methods when tackling label concept drift across clients.
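The abstract does not spell out the inference algorithms, but the hierarchical-Bayesian MAP idea it sketches — a shared global variable acting as a prior over per-client parameters — can be illustrated on a toy problem. The following is a minimal sketch, not the paper's method: it assumes a Gaussian prior w_k ~ N(mu, lam^-1 I) over per-client linear-regression weights, alternates client-side MAP solves (a ridge-like problem pulled toward mu) with a server-side update of mu as the mean of the client solutions; the variable names and the averaging rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumed for illustration): 3 clients, each with a linear
# regression task whose true weights scatter around a shared global
# mean -- the "common trend" the global variable should capture.
mu_true = np.array([1.0, -2.0])
clients = []
for _ in range(3):
    w_k = mu_true + 0.3 * rng.standard_normal(2)
    X = rng.standard_normal((50, 2))
    y = X @ w_k + 0.1 * rng.standard_normal(50)
    clients.append((X, y))

lam = 1.0          # prior precision: w_k ~ N(mu, lam^-1 I)
mu = np.zeros(2)   # global variable (common trend), initialized at zero

# Alternating MAP: each client minimizes its squared loss plus a
# quadratic prior term pulling it toward mu (closed-form ridge solve);
# the server then re-estimates mu from the client solutions.
for _ in range(20):
    ws = []
    for X, y in clients:
        # argmin_w ||X w - y||^2 + lam * ||w - mu||^2
        A = X.T @ X + lam * np.eye(2)
        b = X.T @ y + lam * mu
        ws.append(np.linalg.solve(A, b))
    mu = np.mean(ws, axis=0)

print(mu)  # converges near mu_true; each w_k stays personalized
```

Each client keeps its own personalized solution `ws[k]`, while `mu` aggregates the shared structure — the role the abstract assigns to the global variable in the augmented posterior.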