Meta-learning has received considerable attention as one approach to enable deep neural networks to learn from only a few examples. Recent results suggest that simply fine-tuning a pre-trained network may be more effective at learning new image classification tasks from limited data than more complex meta-learning techniques such as MAML. This is surprising, as the learning behavior of MAML mimics that of fine-tuning. We investigate this phenomenon and show that the pre-trained features are more diverse and discriminative than those learned by MAML and Reptile, which specialize for adaptation in low-data regimes on data distributions similar to the one used for training. Due to this specialization and lack of diversity, MAML and Reptile may fail to generalize to out-of-distribution tasks, whereas fine-tuning can fall back on the diversity of the learned features.
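The observation that MAML's learning behavior mimics fine-tuning follows from how both are deployed at test time: each starts from some initialization (meta-learned for MAML/Reptile, pre-trained for fine-tuning) and takes a few gradient steps on the support set of a new task. The sketch below illustrates this shared adaptation procedure; it is a minimal, assumed setup in PyTorch (not the authors' code), with a toy linear backbone and random placeholder data standing in for a real few-shot episode.

```python
# Minimal sketch (assumed setup): at test time, a MAML/Reptile-style
# meta-learned initialization and an ordinarily pre-trained network are
# adapted in the same way -- a few gradient steps on the support set.
import copy

import torch
import torch.nn as nn


def adapt(model: nn.Module, support_x, support_y, steps: int = 5, lr: float = 0.01):
    """Fine-tune a copy of `model` on a small support set and return it."""
    adapted = copy.deepcopy(model)
    opt = torch.optim.SGD(adapted.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(adapted(support_x), support_y)
        loss.backward()
        opt.step()
    return adapted


# Toy 5-way task with random placeholder data (a real episode would use
# images and a convolutional backbone).
backbone = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 5))
support_x, support_y = torch.randn(25, 64), torch.randint(0, 5, (25,))
query_x = torch.randn(50, 64)

# Whether `backbone` holds meta-learned or pre-trained weights, the
# test-time adaptation procedure itself is identical; the difference lies
# in how diverse and discriminative the initial features are.
classifier = adapt(backbone, support_x, support_y)
predictions = classifier(query_x).argmax(dim=1)
```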
Author Information
Mike Huisman (Leiden University)
Jan van Rijn (Columbia University)
Aske Plaat (Leiden University)
More from the Same Authors
- 2021 : OpenML Benchmarking Suites » Bernd Bischl · Giuseppe Casalicchio · Matthias Feurer · Pieter Gijsbers · Frank Hutter · Michel Lang · Rafael Gomes Mantovani · Jan van Rijn · Joaquin Vanschoren
- 2022 : AutoML for Neural Network Robustness Verification » Jan van Rijn
- 2022 Poster: Meta-Album: Multi-domain Meta-Dataset for Few-Shot Image Classification » Ihsan Ullah · Dustin Carrión-Ojeda · Sergio Escalera · Isabelle Guyon · Mike Huisman · Felix Mohr · Jan N. van Rijn · Haozhe Sun · Joaquin Vanschoren · Phan Anh Vu
- 2018 : Meta Learning for Defaults - Symbolic Defaults » Jan van Rijn