
ZooD: Exploiting Model Zoo for Out-of-Distribution Generalization
Qishi Dong · Awais Muhammad · Fengwei Zhou · Chuanlong Xie · Tianyang Hu · Yongxin Yang · Sung-Ho Bae · Zhenguo Li

Wed Nov 30 09:00 AM -- 11:00 AM (PST) @ Hall J #709

Recent advances in large-scale pre-training have shown the great potential of leveraging a large set of Pre-Trained Models (PTMs) for improving Out-of-Distribution (OoD) generalization, where the goal is to perform well on possible unseen domains after fine-tuning on multiple training domains. However, maximally exploiting a zoo of PTMs is challenging: fine-tuning all possible combinations of PTMs is computationally prohibitive, while accurate selection of PTMs requires tackling the possible data distribution shift of OoD tasks. In this work, we propose ZooD, a paradigm for ranking and ensembling PTMs with feature selection. Our proposed metric ranks PTMs by quantifying the inter-class discriminability and inter-domain stability of the features extracted by each PTM, in a leave-one-domain-out cross-validation manner. The top-K ranked models are then aggregated for the target OoD task. To avoid accumulating noise induced by the model ensemble, we propose an efficient variational EM algorithm to select informative features. We evaluate our paradigm on a diverse model zoo consisting of 35 models for various OoD tasks and demonstrate: (i) model ranking is better correlated with fine-tuning ranking than with previous methods and is up to 9859x faster than brute-force fine-tuning; (ii) OoD generalization after model ensemble with feature selection outperforms the state-of-the-art methods, and the accuracy on the most challenging task, DomainNet, is improved from 46.5% to 50.6%. Furthermore, we provide the fine-tuning results of 35 PTMs on 7 OoD datasets, hoping to help research on model zoos and OoD generalization. Code will be available at https://gitee.com/mindspore/models/tree/master/research/cv/zood.
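The abstract's ranking step can be illustrated with a minimal sketch. The snippet below is a hypothetical stand-in, not the paper's actual metric: it scores each PTM's pre-extracted features in a leave-one-domain-out loop, using a Fisher-style ratio as a proxy for inter-class discriminability and a simple held-out-mean distance as a proxy for inter-domain stability, then returns the top-K models. All function names and the exact score formula are illustrative assumptions.

```python
import numpy as np

def zood_style_score(features_by_domain, labels_by_domain):
    """Hypothetical proxy for ZooD's metric: leave-one-domain-out score
    combining inter-class discriminability and inter-domain stability."""
    scores = []
    domains = list(features_by_domain)
    for held_out in domains:
        train = [d for d in domains if d != held_out]
        X = np.vstack([features_by_domain[d] for d in train])
        y = np.concatenate([labels_by_domain[d] for d in train])
        mu = X.mean(axis=0)
        # Discriminability proxy: between-class / within-class scatter.
        between, within = 0.0, 0.0
        for c in np.unique(y):
            Xc = X[y == c]
            between += len(Xc) * np.sum((Xc.mean(axis=0) - mu) ** 2)
            within += np.sum((Xc - Xc.mean(axis=0)) ** 2)
        discrim = between / (within + 1e-8)
        # Stability proxy: closeness of the held-out domain's feature
        # mean to the training-domain mean (smaller shift -> higher score).
        mu_out = features_by_domain[held_out].mean(axis=0)
        stability = 1.0 / (1.0 + np.linalg.norm(mu_out - mu))
        scores.append(discrim * stability)
    return float(np.mean(scores))

def rank_model_zoo(zoo_features, labels_by_domain, top_k=3):
    """zoo_features: {model_name: {domain: (n_d, dim) feature array}}.
    Returns the top_k model names by the proxy score above."""
    ranked = sorted(
        zoo_features,
        key=lambda m: zood_style_score(zoo_features[m], labels_by_domain),
        reverse=True,
    )
    return ranked[:top_k]
```

In the paper, the top-K models selected this way are then ensembled, with a variational EM step pruning uninformative feature dimensions; that step is omitted here.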

Author Information

Qishi Dong (Hong Kong Baptist University)
Awais Muhammad (Kyung-Hee University)
Fengwei Zhou (Huawei Technologies Ltd.)
Chuanlong Xie (Beijing Normal University)
Tianyang Hu (Huawei Technologies Ltd.)
Yongxin Yang (Queen Mary University of London)
Sung-Ho Bae (Kyung Hee University)
Zhenguo Li (Noah's Ark Lab, Huawei Tech Investment Co Ltd)