

Poster
in
Workshop: Workshop on Distribution Shifts: Connecting Methods and Applications

Invariant Feature Subspace Recovery for Multi-Class Classification

Gargi Balasubramaniam · Haoxiang Wang · Han Zhao


Abstract: Domain generalization aims to learn a model over multiple training environments that generalizes to unseen environments. Recently, Wang et al. [2022] proposed Invariant-feature Subspace Recovery (ISR), a domain generalization algorithm which uses the means of class-conditional data distributions to provably identify the invariant-feature subspace. However, the original ISR algorithm conditions on a single class only, without utilizing information from the remaining classes. In this work, we consider the setting of multi-class classification and propose an extension of the ISR algorithm, called ISR-Multiclass. The proposed algorithm can provably recover the invariant-feature subspace with $\mathcal{O}(d_{spu}/k) + 1$ environments, where $d_{spu}$ is the number of spurious features and $k$ is the number of classes. Empirically, we first examine ISR-Multiclass on a synthetic dataset and demonstrate its superiority over the original ISR in the multi-class setting. Furthermore, we conduct experiments on Multiclass Coloured MNIST, a semi-synthetic dataset with strong spurious correlations, and show that ISR-Multiclass can significantly improve the robustness of neural networks trained by various methods (e.g., ERM and IRM) against spurious correlations.
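The core mechanism the abstract describes — recovering the invariant-feature subspace from class-conditional means that vary across environments only along spurious directions — can be illustrated with a minimal sketch. This is not the authors' code: the function name `isr_mean_subspace`, the single-class setting, and the assumption that `d_spu` is known are all illustrative simplifications of the ISR-Mean idea.

```python
import numpy as np

def isr_mean_subspace(means, d_spu):
    """Sketch of ISR-style recovery for one class.

    means: (num_envs, d) array of class-conditional feature means,
           one row per training environment.
    d_spu: assumed-known number of spurious features.

    Invariant features have the same class-conditional mean in every
    environment, so centering the means across environments leaves
    variation only along spurious directions. The top d_spu right
    singular vectors capture that spurious variation; the remaining
    directions span an estimate of the invariant-feature subspace.
    """
    centered = means - means.mean(axis=0, keepdims=True)
    _, _, vt = np.linalg.svd(centered, full_matrices=True)
    spurious_basis = vt[:d_spu]       # directions that vary across envs
    invariant_basis = vt[d_spu:]      # orthogonal complement: invariant subspace
    return invariant_basis, spurious_basis

# Toy check: 3 environments, 3 features; the first two (invariant)
# features keep mean (1, 2) everywhere, the third (spurious) shifts.
env_means = np.array([[1.0, 2.0, 0.0],
                      [1.0, 2.0, 1.0],
                      [1.0, 2.0, -1.0]])
inv_basis, spu_basis = isr_mean_subspace(env_means, d_spu=1)
```

Projecting data onto `inv_basis` then discards the spurious coordinate, which is the preprocessing step the abstract reports as improving the robustness of ERM- and IRM-trained models.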
