

Poster

Metalearned Neural Memory

Tsendsuren Munkhdalai · Alessandro Sordoni · Tong Wang · Adam Trischler

East Exhibition Hall B + C #30

Keywords: [ Memory ] [ Meta-Learning ] [ Algorithms ] [ Algorithms -> Few-Shot Learning ] [ Deep Learning -> Memory-Augmented Neural Networks ] [ Neuroscience and Cognitive Science ]


Abstract:

We augment recurrent neural networks with an external memory mechanism that builds upon recent progress in metalearning. We conceptualize this memory as a rapidly adaptable function that we parameterize as a deep neural network. Reading from the neural memory function amounts to pushing an input (the key vector) through the function to produce an output (the value vector). Writing to memory means changing the function; specifically, updating the parameters of the neural network to encode desired information. We leverage training and algorithmic techniques from metalearning to update the neural memory function in one shot. The proposed memory-augmented model achieves strong performance on a variety of learning problems, from supervised question answering to reinforcement learning.
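To make the read/write picture concrete, the snippet below is a minimal illustrative sketch, not the authors' implementation: it parameterizes memory as a small MLP, treats reading as a forward pass on a key vector, and treats writing as a gradient step that binds a key to a target value. The class and method names (`NeuralMemory`, `read`, `write`) and the plain multi-step gradient write are assumptions for illustration; the paper instead metalearns a one-shot update rule.

```python
import torch
import torch.nn as nn


class NeuralMemory(nn.Module):
    """Memory as a function: a small MLP mapping key vectors to value vectors."""

    def __init__(self, key_dim=32, value_dim=32, hidden_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(key_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, value_dim),
        )

    def read(self, key):
        # Reading: push the key through the function to produce a value.
        return self.net(key)

    def write(self, key, target_value, lr=0.5, steps=1):
        # Writing: change the function, i.e. update its parameters so that
        # read(key) approximates target_value. Here this is a plain gradient
        # step; the paper replaces it with a metalearned one-shot update.
        params = list(self.net.parameters())
        for _ in range(steps):
            loss = ((self.read(key) - target_value) ** 2).mean()
            grads = torch.autograd.grad(loss, params)
            with torch.no_grad():
                for p, g in zip(params, grads):
                    p -= lr * g


memory = NeuralMemory()
k = torch.randn(1, 32)
v = torch.randn(1, 32)
memory.write(k, v, steps=20)                       # bind k -> v by updating parameters
print(((memory.read(k) - v) ** 2).mean().item())   # reconstruction error after writing
```

The sketch highlights the key design choice described in the abstract: memory content lives in the weights of a function rather than in a slot-based table, so writing is a learning problem, which is what motivates the metalearned update.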
