Human intelligence is characterized not only by the capacity to learn complex skills, but also by the ability to rapidly adapt and acquire new skills in an ever-changing environment. In this work we study how learning modular solutions can enable effective generalization to data that is both unseen and potentially differently distributed. Our main postulate is that the combination of task segmentation, modular learning and memory-based ensembling can give rise to generalization on an exponentially growing number of unseen tasks. We provide a concrete instantiation of this idea using a combination of: (1) the Forget-Me-Not Process, for task segmentation and memory-based ensembling; and (2) Gated Linear Networks, which in contrast to contemporary deep learning techniques use a modular architecture and local learning rules. We demonstrate that this system exhibits a number of desirable continual learning properties: robustness to catastrophic forgetting, no negative transfer, and increasing levels of positive transfer as more tasks are seen. We show competitive performance against both offline and online methods on standard continual learning benchmarks.
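To make the second ingredient concrete, the sketch below implements a single GLN-style neuron: side information selects one of several weight vectors via fixed random halfspace gating, the neuron geometrically mixes the probabilities it receives from below, and its weights are trained with a purely local log-loss update, with no backpropagated error signal. This is a minimal reading of the Gated Linear Network construction under our own naming (HalfspaceGatedNeuron, n_halfspaces, lr are illustrative choices, not the authors' code), and it omits the Forget-Me-Not segmentation and memory-based ensembling components.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def logit(p):
    p = np.clip(p, 1e-6, 1.0 - 1e-6)
    return np.log(p / (1.0 - p))

class HalfspaceGatedNeuron:
    """Illustrative GLN-style neuron (names and defaults are ours).

    Side information z picks a context via fixed random halfspaces;
    each context owns its own weight vector, so learning in one
    context leaves the others untouched (the modular part).
    """

    def __init__(self, n_inputs, n_halfspaces, side_dim, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        # Fixed random hyperplanes carve side info into 2**n_halfspaces contexts.
        self.hyperplanes = rng.normal(size=(n_halfspaces, side_dim))
        # One weight vector per context, initialized to uniform mixing.
        self.weights = np.full((2 ** n_halfspaces, n_inputs), 1.0 / n_inputs)
        self.lr = lr

    def _context(self, z):
        bits = (self.hyperplanes @ z > 0).astype(int)
        return int("".join(map(str, bits)), 2)

    def predict(self, p, z):
        # Geometric mixing of the input probabilities p, gated by z.
        c = self._context(z)
        return sigmoid(self.weights[c] @ logit(p)), c

    def update(self, p, z, target):
        # Purely local online gradient step on the log loss:
        # the gradient w.r.t. w is (prediction - target) * logit(p).
        pred, c = self.predict(p, z)
        self.weights[c] -= self.lr * (pred - target) * logit(p)
        # GLNs typically clip weights to a hypercube for stability.
        np.clip(self.weights[c], -5.0, 5.0, out=self.weights[c])
        return pred

For example, a neuron fed two base predictors on 4-dimensional side information would be constructed as HalfspaceGatedNeuron(n_inputs=2, n_halfspaces=3, side_dim=4) and trained by calling update(p, z, y) once per observation as data streams in; because only the active context's weights move, the update is both local and modular.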
Author Information
Jianan Wang (DeepMind)
Eren Sezener (DeepMind)
David Budden (DeepMind)
Marcus Hutter (DeepMind)
Joel Veness (DeepMind)
More from the Same Authors
- 2023 Poster: Self-Predictive Universal AI
  Elliot Catt · Jordi Grau-Moya · Marcus Hutter · Matthew Aitchison · Tim Genewein · Grégoire Delétang · Li Kevin Wenliang · Joel Veness
- 2023 Poster: DreamWaltz: Make a Scene with Complex 3D Animatable Avatars
  Yukun Huang · Jianan Wang · Ailing Zeng · He CAO · Xianbiao Qi · Yukai Shi · Zheng-Jun Zha · Lei Zhang
- 2020 Poster: Modular Meta-Learning with Shrinkage
  Yutian Chen · Abram Friesen · Feryal Behbahani · Arnaud Doucet · David Budden · Matthew Hoffman · Nando de Freitas
- 2020 Spotlight: Modular Meta-Learning with Shrinkage
  Yutian Chen · Abram Friesen · Feryal Behbahani · Arnaud Doucet · David Budden · Matthew Hoffman · Nando de Freitas
- 2020 Poster: Online Learning in Contextual Bandits using Gated Linear Networks
  Eren Sezener · Marcus Hutter · David Budden · Jianan Wang · Joel Veness
- 2020 Poster: Gaussian Gated Linear Networks
  David Budden · Adam Marblestone · Eren Sezener · Tor Lattimore · Gregory Wayne · Joel Veness
- 2020 Poster: Logarithmic Pruning is All You Need
  Laurent Orseau · Marcus Hutter · Omar Rivasplata
- 2020 Spotlight: Logarithmic Pruning is All You Need
  Laurent Orseau · Marcus Hutter · Omar Rivasplata
- 2019 Poster: Learning Object Bounding Boxes for 3D Instance Segmentation on Point Clouds
  Bo Yang · Jianan Wang · Ronald Clark · Qingyong Hu · Sen Wang · Andrew Markham · Niki Trigoni
- 2019 Spotlight: Learning Object Bounding Boxes for 3D Instance Segmentation on Point Clouds
  Bo Yang · Jianan Wang · Ronald Clark · Qingyong Hu · Sen Wang · Andrew Markham · Niki Trigoni
- 2018 Poster: Playing hard exploration games by watching YouTube
  Yusuf Aytar · Tobias Pfaff · David Budden · Thomas Paine · Ziyu Wang · Nando de Freitas
- 2018 Spotlight: Playing hard exploration games by watching YouTube
  Yusuf Aytar · Tobias Pfaff · David Budden · Thomas Paine · Ziyu Wang · Nando de Freitas