Invited Talk #5: Materials Matter: How biologically inspired alternatives to conventional neural networks improve meta-learning and continual learning
Jeff Clune

Fri Dec 13 02:30 PM -- 03:00 PM (PST)

I will describe how alternatives to conventional neural networks that are very loosely biologically inspired can improve meta-learning, including continual learning. First, I will summarize differentiable Hebbian learning and differentiable neuromodulated Hebbian learning (aka “backpropamine”). Both are techniques for training deep neural networks with synaptic plasticity, meaning the weights can change during meta-testing/inference. Whereas meta-learned RNNs can only store within-episode information in their activations, such plastic Hebbian networks can store information in their weights in addition to their activations, improving performance on some classes of problems. Second, I will describe a new, unpublished method that improves the state of the art in continual learning. ANML (A Neuromodulated Meta-Learning algorithm) meta-learns a neuromodulatory network that gates the activity of the main prediction network, enabling the learning of up to 600 simple tasks sequentially.
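To make the idea of storing within-episode information in the weights concrete, below is a minimal sketch of a differentiable plastic (Hebbian) layer in the spirit of the methods mentioned in the abstract. The class and parameter names (PlasticLayer, alpha, eta, modulation) are illustrative assumptions, not the talk's actual implementation; the optional modulation argument only hints at how a backpropamine-style neuromodulatory signal could scale the trace update.

```python
import torch
import torch.nn as nn

class PlasticLayer(nn.Module):
    """Sketch of a differentiable Hebbian ('plastic') layer: each connection has a
    fixed, meta-learned weight plus a meta-learned plasticity coefficient that
    scales a Hebbian trace updated within the episode, so information is stored
    in the weights rather than only in the activations."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.w = nn.Parameter(0.01 * torch.randn(out_features, in_features))      # fixed component (meta-learned)
        self.alpha = nn.Parameter(0.01 * torch.randn(out_features, in_features))  # plasticity coefficients (meta-learned)
        self.eta = nn.Parameter(torch.tensor(0.01))                                # trace update rate (meta-learned)

    def init_trace(self, batch_size: int) -> torch.Tensor:
        # One Hebbian trace per episode in the batch, reset at episode start.
        return torch.zeros(batch_size, self.w.shape[0], self.w.shape[1])

    def forward(self, x: torch.Tensor, hebb: torch.Tensor, modulation: torch.Tensor = None):
        # Effective weights = fixed part + plastic (Hebbian) part.
        w_eff = self.w + self.alpha * hebb                      # (batch, out, in)
        y = torch.tanh(torch.einsum('boi,bi->bo', w_eff, x))    # (batch, out)
        # Hebbian update: decayed trace plus outer product of post- and pre-synaptic activity.
        # In a backpropamine-style variant, a network-computed neuromodulatory
        # signal (here the hypothetical `modulation`) would gate this update.
        rate = self.eta if modulation is None else self.eta * modulation
        hebb = (1 - rate) * hebb + rate * torch.einsum('bo,bi->boi', y, x)
        return y, hebb
```

In this sketch the trace is zeroed at the start of each episode, updated at every step during meta-testing, and gradients flow through both the fixed weights and the plasticity coefficients during meta-training.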

Author Information

Jeff Clune (Uber AI Labs)

Jeff is a senior research scientist and founding member of Uber AI Labs. He is also the Loy and Edith Harris Associate Professor of Computer Science at the University of Wyoming, where he directs the Evolving AI Lab (http://EvolvingAI.org). He researches robotics and the creation of artificial intelligence in neural networks, via either deep learning or evolutionary algorithms.
