A Preliminary Study on the Feature Representations of Transfer Learning and Gradient-Based Meta-Learning Techniques
Mike Huisman · Jan van Rijn · Aske Plaat
Event URL: https://openreview.net/forum?id=bkZRxSrCMf0

Meta-learning has received considerable attention as one approach to enable deep neural networks to learn from a small amount of data. Recent results suggest that simply fine-tuning a pre-trained network may be more effective at learning new image classification tasks from limited data than more complicated meta-learning techniques such as MAML. This is surprising, as the learning behavior of MAML mimics that of fine-tuning. We investigate this phenomenon and show that the pre-trained features are more diverse and discriminative than those learned by MAML and Reptile, which specialize for adaptation in low-data regimes on a data distribution similar to the one used for training. Due to this specialization and lack of diversity, MAML and Reptile may fail to generalize to out-of-distribution tasks, whereas fine-tuning can fall back on the diversity of the learned features.
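A minimal sketch (hypothetical code, not the authors' implementation) of why MAML's learning behavior mimics fine-tuning: at test time, both adapt to a new task by running gradient descent on the support set from some initialization. The only difference is how that initialization was produced during training (standard pre-training vs. meta-training). The scalar model `f(x) = w * x` with squared-error loss and all names below are illustrative assumptions.

```python
# Hypothetical scalar example: model f(x) = w * x, squared-error loss.
# The test-time adaptation step is identical for fine-tuning and MAML;
# they differ only in how the initialization w0 was obtained.

def adapt(w0, support, lr=0.01, steps=5):
    """Gradient descent on the support set, starting from w0."""
    w = w0
    for _ in range(steps):
        # d/dw sum((w*x - y)^2) = sum(2*x*(w*x - y))
        grad = sum(2 * x * (w * x - y) for x, y in support)
        w -= lr * grad
    return w

support = [(1.0, 2.0), (2.0, 4.0)]  # toy task: y = 2x

# Same adaptation procedure, different (hypothetical) initializations:
w_finetuned = adapt(w0=0.5, support=support)  # w0 from pre-training
w_maml = adapt(w0=0.5, support=support)       # w0 from meta-training
```

The paper's point is that although this adaptation mechanics is shared, meta-trained initializations encode less diverse features, which hurts when the test task lies outside the training distribution.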

Author Information

Mike Huisman (Leiden University)
Jan van Rijn (Columbia University)
Aske Plaat (Leiden University)
