

Poster

Meta-Exploiting Frequency Prior for Cross-Domain Few-Shot Learning

Fei Zhou · Peng Wang · Lei Zhang · Zhenghua Chen · Wei Wei · Chen Ding · Guosheng Lin · Yanning Zhang

East Exhibit Hall A-C #3509
Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Meta-learning offers a promising avenue for few-shot learning (FSL), enabling models to glean a generalizable feature embedding through episodic training on synthetic FSL tasks in a source domain. Yet, in practical scenarios where the target task diverges from the source domain, meta-learning-based methods are susceptible to over-fitting. To overcome this, we introduce a novel framework, Meta-Exploiting Frequency Prior for Cross-Domain Few-Shot Learning, which is crafted to comprehensively exploit the cross-domain transferable image prior that each image can be decomposed into complementary low-frequency content details and high-frequency robust structural characteristics. Motivated by this insight, we propose to decompose each query image into its high-frequency and low-frequency components and incorporate them in parallel into the feature embedding network to enhance the final category prediction. More importantly, we introduce a feature reconstruction prior and a prediction consistency prior, which respectively encourage consistency of the intermediate features and of the final category predictions between the original query image and its decomposed frequency components. These priors collectively guide the network's meta-learning toward generalizable image feature embeddings, without introducing any extra computational cost in the inference phase. Our framework establishes new state-of-the-art results on multiple cross-domain few-shot learning benchmarks.
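
The sketch below illustrates the two ingredients described in the abstract: an FFT-based split of each query image into low- and high-frequency components, and the feature-reconstruction and prediction-consistency terms computed against the original image. It is a minimal illustration under assumed choices (a radial low-pass mask, an MSE reconstruction term, a KL prediction term); the names `frequency_decompose`, `encoder`, `classifier`, and the `cutoff` parameter are hypothetical and not taken from the authors' implementation.

```python
# Hedged sketch of the frequency decomposition and the two consistency
# priors from the abstract. All module/function names are illustrative
# assumptions, not the paper's actual code.
import torch
import torch.nn.functional as F


def frequency_decompose(images: torch.Tensor, cutoff: float = 0.1):
    """Split a batch of images (B, C, H, W) into low- and high-frequency
    parts using a circular mask in the centered Fourier spectrum."""
    _, _, H, W = images.shape
    spec = torch.fft.fftshift(torch.fft.fft2(images), dim=(-2, -1))

    # Radial low-pass mask around the spectrum center.
    yy, xx = torch.meshgrid(
        torch.arange(H, device=images.device),
        torch.arange(W, device=images.device),
        indexing="ij",
    )
    dist = torch.sqrt((yy - H / 2) ** 2 + (xx - W / 2) ** 2)
    low_mask = (dist <= cutoff * min(H, W)).to(spec.dtype)

    low = torch.fft.ifft2(torch.fft.ifftshift(spec * low_mask, dim=(-2, -1))).real
    high = torch.fft.ifft2(torch.fft.ifftshift(spec * (1 - low_mask), dim=(-2, -1))).real
    return low, high


def consistency_losses(encoder, classifier, query: torch.Tensor):
    """Feature-reconstruction and prediction-consistency terms between the
    original query image and its two frequency components."""
    low, high = frequency_decompose(query)

    feat = encoder(query)                      # embedding of the original image
    feat_low, feat_high = encoder(low), encoder(high)

    # Feature reconstruction prior (one simple instantiation): the two
    # components should jointly reconstruct the original embedding.
    loss_feat = F.mse_loss(feat_low + feat_high, feat)

    # Prediction consistency prior: class predictions from each frequency
    # component should agree with the prediction from the original image.
    logits = classifier(feat)
    loss_pred = sum(
        F.kl_div(
            F.log_softmax(classifier(f), dim=-1),
            F.softmax(logits.detach(), dim=-1),
            reduction="batchmean",
        )
        for f in (feat_low, feat_high)
    )
    return loss_feat, loss_pred
```

Because both priors are applied only during episodic training, the encoder at test time processes the query image alone, which is why no extra inference cost is incurred.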
