
Is Importance Weighting Incompatible with Interpolating Classifiers?
Ke Alexander Wang · Niladri Chatterji · Saminul Haque · Tatsunori Hashimoto
Event URL: https://openreview.net/forum?id=pEhpLxVsd03

Importance weighting is a classic technique to handle distribution shifts. However, prior work has presented strong empirical and theoretical evidence demonstrating that importance weights can have little to no effect on overparameterized neural networks. Is importance weighting truly incompatible with the training of overparameterized neural networks? Our paper answers this in the negative. We show that importance weighting fails not because of the overparameterization, but instead, as a result of using exponentially-tailed losses like the logistic or cross-entropy loss. As a remedy, we show that polynomially-tailed losses restore the effects of importance reweighting in correcting distribution shift in overparameterized models. We characterize the behavior of gradient descent on importance weighted polynomially-tailed losses with overparameterized linear models, and theoretically demonstrate the advantage of using polynomially-tailed losses in a label shift setting. Surprisingly, our theory shows that using weights that are obtained by exponentiating the classical unbiased importance weights can improve performance. Finally, we demonstrate the practical value of our analysis with neural network experiments on a subpopulation shift and a label shift dataset. Our polynomially-tailed loss consistently increases the test accuracy by 2–3%.
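To make the abstract's central idea concrete, here is a minimal sketch of an importance-weighted margin loss with a polynomial tail. The function name, the exact tail form `(1 + z)^(-alpha)`, and the logistic-style left branch are illustrative assumptions, not the loss defined in the paper; the point is only that for large positive margins the loss decays polynomially rather than exponentially, so importance weights retain influence on the solution.

```python
import numpy as np

def weighted_poly_tail_loss(margins, weights, alpha=1.0):
    """Illustrative importance-weighted loss with a polynomial tail.

    For margins z >= 0 the loss is (1 + z)**(-alpha), which decays
    polynomially, unlike the logistic loss log(1 + exp(-z)), which
    decays exponentially. For z < 0 we use a shifted logistic loss so
    the two branches agree (both equal 1) at z = 0. This is a sketch
    of the idea, not the exact construction from the paper.
    """
    margins = np.asarray(margins, dtype=float)
    weights = np.asarray(weights, dtype=float)
    # Left branch: shifted logistic loss; equals 1 at z = 0.
    left = np.log1p(np.exp(-margins)) - np.log(2.0) + 1.0
    # Right branch: polynomial tail; also equals 1 at z = 0.
    right = (1.0 + np.maximum(margins, 0.0)) ** (-alpha)
    per_example = np.where(margins < 0.0, left, right)
    # Importance weights rescale each example's contribution.
    return float(np.mean(weights * per_example))
```

Because the tail is polynomial, doubling an example's weight here has a persistent effect even when its margin grows during training, whereas under an exponentially-tailed loss the weighted contribution of well-classified points vanishes too quickly to matter.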

Author Information

Ke Alexander Wang (Stanford University)
Niladri Chatterji (Stanford University)
Saminul Haque (University of Toronto)
Tatsunori Hashimoto (Stanford)

