Poster
No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data
Mi Luo · Fei Chen · Dapeng Hu · Yifan Zhang · Tim Liang · Jiashi Feng

Thu Dec 09 12:30 AM -- 02:00 AM (PST)

A central challenge in training classification models in real-world federated systems is learning with non-IID data. To cope with this, most existing works either enforce regularization during local optimization or improve the model aggregation scheme at the server. Other works share public datasets or synthesized samples to supplement the training of under-represented classes, or introduce a certain level of personalization. Though effective, these approaches lack a deep understanding of how data heterogeneity affects each layer of a deep classification model. In this paper, we bridge this gap by performing an experimental analysis of the representations learned by different layers. Our observations are surprising: (1) the classifier exhibits a greater bias than any other layer, and (2) classification performance can be significantly improved by post-calibrating the classifier after federated training. Motivated by these findings, we propose a novel and simple algorithm called Classifier Calibration with Virtual Representations (CCVR), which adjusts the classifier using virtual representations sampled from an approximated Gaussian mixture model. Experimental results demonstrate that CCVR achieves state-of-the-art performance on popular federated learning benchmarks, including CIFAR-10, CIFAR-100, and CINIC-10. We hope that our simple yet effective method sheds light on future research on federated learning with non-IID data.
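The calibration idea described above can be sketched in a few lines: fit a per-class Gaussian over feature vectors, draw virtual representations from those Gaussians, and retrain only the final linear classifier on them. This is a minimal NumPy illustration, not the authors' implementation — the function names, the diagonal covariance regularization, and the plain softmax-regression retraining loop are assumptions for the sketch.

```python
import numpy as np

def fit_class_gaussians(features, labels, num_classes):
    """Estimate a per-class Gaussian (mean, covariance) over feature vectors."""
    stats = {}
    for c in range(num_classes):
        fc = features[labels == c]
        mu = fc.mean(axis=0)
        # Small ridge term keeps the covariance positive-definite for sampling.
        cov = np.cov(fc, rowvar=False) + 1e-4 * np.eye(fc.shape[1])
        stats[c] = (mu, cov)
    return stats

def sample_virtual_features(stats, per_class, rng):
    """Draw `per_class` virtual representations from each class Gaussian."""
    xs, ys = [], []
    for c, (mu, cov) in stats.items():
        xs.append(rng.multivariate_normal(mu, cov, size=per_class))
        ys.append(np.full(per_class, c))
    return np.concatenate(xs), np.concatenate(ys)

def calibrate_classifier(virtual_x, virtual_y, num_classes, lr=0.1, epochs=200):
    """Retrain a linear (softmax) classifier head on the virtual features only."""
    d = virtual_x.shape[1]
    W = np.zeros((d, num_classes))
    b = np.zeros(num_classes)
    onehot = np.eye(num_classes)[virtual_y.astype(int)]
    for _ in range(epochs):
        logits = virtual_x @ W + b
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)
        g = (p - onehot) / len(virtual_x)            # softmax cross-entropy grad
        W -= lr * (virtual_x.T @ g)
        b -= lr * g.sum(axis=0)
    return W, b
```

In the paper's federated setting the feature statistics would come from clients' local features rather than a centrally pooled set; the sketch only shows the calibration step itself, with the frozen feature extractor left implicit.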

Author Information

Mi Luo (National University of Singapore)
Fei Chen (Huawei Noah's Ark Lab)
Dapeng Hu (National University of Singapore)
Yifan Zhang (National University of Singapore)
Tim Liang (CASIA)
Jiashi Feng (UC Berkeley)
