Poster
A Unified View of Label Shift Estimation
Saurabh Garg · Yifan Wu · Sivaraman Balakrishnan · Zachary Lipton

Thu Dec 10 09:00 AM -- 11:00 AM (PST) @ Poster Session 5 #1535
Under label shift, the label distribution $p(y)$ might change but the class-conditional distributions $p(x|y)$ do not. There are two dominant approaches for estimating the label marginal. BBSE, a moment-matching approach based on confusion matrices, is provably consistent and provides interpretable error bounds. However, a maximum likelihood estimation approach, which we call MLLS, dominates empirically. In this paper, we present a unified view of the two methods and the first theoretical characterization of MLLS. Our contributions include (i) consistency conditions for MLLS, which include calibration of the classifier and a confusion matrix invertibility condition that BBSE also requires; (ii) a unified framework, casting BBSE as roughly equivalent to MLLS for a particular choice of calibration method; and (iii) a decomposition of MLLS's finite-sample error into terms reflecting miscalibration and estimation error. Our analysis attributes BBSE's statistical inefficiency to a loss of information due to coarse calibration. Experiments on synthetic data, MNIST, and CIFAR10 support our findings.
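The two estimators described in the abstract can be sketched concretely. BBSE solves a linear system built from the classifier's confusion matrix, while MLLS runs an EM-style fixed-point iteration on calibrated source posteriors (in the style of Saerens et al.). The snippet below is an illustrative sketch, not the authors' implementation; the function names, the choice of a joint (rather than conditional) confusion matrix, and the fixed iteration count are assumptions made for the example.

```python
import numpy as np

def bbse_estimate(confusion, target_pred_marginal):
    """BBSE (moment-matching) sketch.

    confusion: C[i, j] = p_source(f(x)=i, y=j), estimated on held-out
        labeled source data (a joint confusion matrix, an assumption here).
    target_pred_marginal: mu[i] = p_target(f(x)=i), estimated on
        unlabeled target data.
    Solves C w = mu for the importance weights
    w[j] = p_target(y=j) / p_source(y=j), assuming C is invertible
    (the invertibility condition the paper shows both methods need).
    """
    w = np.linalg.solve(confusion, target_pred_marginal)
    return np.clip(w, 0.0, None)  # weights must be nonnegative

def mlls_em(source_posteriors, source_prior, n_iter=100):
    """MLLS sketch via EM on calibrated source posteriors.

    source_posteriors: (n, k) array of p_source(y | x_i) on unlabeled
        target points; MLLS's consistency requires these to be calibrated.
    source_prior: (k,) array p_source(y).
    Returns an estimate of the target label marginal p_target(y).
    """
    prior = source_prior.copy()
    for _ in range(n_iter):
        # E-step: reweight each posterior toward the current target-prior guess.
        adjusted = source_posteriors * (prior / source_prior)
        adjusted /= adjusted.sum(axis=1, keepdims=True)
        # M-step: new target prior is the average adjusted posterior.
        prior = adjusted.mean(axis=0)
    return prior

# Toy two-class example with a symmetric, invertible confusion matrix.
C = np.array([[0.45, 0.05],   # p(pred=0, y=0), p(pred=0, y=1)
              [0.05, 0.45]])  # p(pred=1, y=0), p(pred=1, y=1)
mu = np.array([0.30, 0.70])   # target marginal of the classifier's predictions
w = bbse_estimate(C, mu)      # importance weights p_t(y) / p_s(y)
```

On this toy instance the source prior is uniform ([0.5, 0.5], the column sums of C), so the recovered weights directly imply the target label marginal; the paper's point is that MLLS achieves lower estimation error than BBSE because coarse (confusion-matrix-level) calibration discards information that the full calibrated posteriors retain.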

Author Information

Saurabh Garg (CMU)
Yifan Wu (Carnegie Mellon University)
Sivaraman Balakrishnan (CMU)
Zachary Lipton (Carnegie Mellon University)
