

Talk in Workshop: Biological and Artificial Reinforcement Learning

Invited Talk #5: Materials Matter: How biologically inspired alternatives to conventional neural networks improve meta-learning and continual learning

Jeff Clune

2019 Talk

Abstract:

I will describe how alternatives to conventional neural networks that are very loosely biologically inspired can improve meta-learning, including continual learning. First, I will summarize differentiable Hebbian learning and differentiable neuromodulated Hebbian learning (aka “backpropamine”). Both are techniques for training deep neural networks with synaptic plasticity, meaning the weights can change during meta-testing/inference. Whereas meta-learned RNNs can store within-episode information only in their activations, such plastic Hebbian networks can store information in their weights in addition to their activations, improving performance on some classes of problems. Second, I will describe a new, unpublished method that improves the state of the art in continual learning: ANML (A Neuromodulated Meta-Learning algorithm) meta-learns a neuromodulatory network that gates the activity of the main prediction network, enabling the sequential learning of up to 600 simple tasks.
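To make the plastic-network idea concrete, here is a minimal PyTorch sketch of a differentiable Hebbian layer in the spirit of the published differentiable-plasticity formulation; it is an illustrative assumption, not the talk's implementation, and the names PlasticLinear, w, alpha, eta, and hebb are invented for this sketch. The fixed weight w, the plasticity coefficients alpha, and the plasticity rate eta are meta-learned by backpropagation, while the Hebbian trace hebb keeps changing at meta-test time, so the layer stores within-episode information in its weights rather than only in its activations:

import torch
import torch.nn as nn

class PlasticLinear(nn.Module):
    # Sketch of a differentiable Hebbian layer (illustrative, not the talk's code).
    def __init__(self, in_features, out_features):
        super().__init__()
        self.w = nn.Parameter(0.01 * torch.randn(in_features, out_features))      # fixed component
        self.alpha = nn.Parameter(0.01 * torch.randn(in_features, out_features))  # plasticity coefficients
        self.eta = nn.Parameter(torch.tensor(0.01))                               # meta-learned plasticity rate

    def forward(self, pre, hebb):
        # pre: (batch, in); hebb: (batch, in, out), initialized to zeros at episode start.
        post = torch.tanh(pre @ self.w + torch.einsum('bi,bio->bo', pre, self.alpha * hebb))
        # Decaying trace of pre/post outer products: the within-episode "weight" memory.
        # Backpropamine would replace the fixed eta with a neuromodulatory signal
        # computed by the network itself, gating when the trace is updated.
        hebb = (1 - self.eta) * hebb + self.eta * torch.einsum('bi,bo->bio', pre, post)
        return post, hebb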
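The ANML gating mechanism can likewise be sketched, again as an assumption based only on the description above: a separately meta-learned neuromodulatory network produces a per-feature sigmoid gate that multiplicatively masks the main prediction network's features before classification. The actual system uses convolutional networks; this fully connected stand-in, with the hypothetical names nm_net, pred_net, and classifier, shows only the gating structure:

import torch
import torch.nn as nn

class GatedPredictor(nn.Module):
    # Sketch of ANML-style neuromodulatory gating (simplified, fully connected).
    def __init__(self, in_dim, feat_dim, n_classes):
        super().__init__()
        self.nm_net = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.Sigmoid())  # neuromodulatory network
        self.pred_net = nn.Linear(in_dim, feat_dim)                             # main prediction network
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, x):
        gate = self.nm_net(x)                     # values in [0, 1], one per feature
        features = torch.relu(self.pred_net(x))
        # Gating selectively silences features (and their gradients) per input,
        # which is what lets many tasks be learned sequentially with less interference.
        return self.classifier(gate * features)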
