

Poster

Learning Attractor Dynamics for Generative Memory

Yan Wu · Gregory Wayne · Karol Gregor · Timothy Lillicrap

Room 210 #33

Keywords: [ Generative Models ] [ Dynamical Systems ] [ Memory-Augmented Neural Networks ]


Abstract:

A central challenge faced by memory systems is the robust retrieval of a stored pattern in the presence of interference from other stored patterns and noise. A theoretically well-founded solution to robust retrieval is given by attractor dynamics, which iteratively clean up patterns during recall. However, incorporating attractor dynamics into modern deep learning systems poses difficulties: attractor basins are characterised by vanishing gradients, which are known to make training neural networks difficult. In this work, we exploit recent advances in variational inference and avoid the vanishing gradient problem by training a generative distributed memory with a variational lower-bound-based Lyapunov function. The model is minimalistic, with surprisingly few parameters. Experiments show that it converges to the correct patterns upon iterative retrieval and achieves competitive performance as both a memory model and a generative model.
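To make the idea of attractor-based cleanup concrete, here is a minimal sketch using a classical Hopfield network with a Hebbian storage rule — this is an illustration of attractor dynamics in general, not the paper's variational generative memory; all names and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def store(patterns):
    # Hebbian outer-product rule; patterns are rows of +/-1 values.
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def retrieve(W, probe, steps=20):
    # Iterate x <- sign(W x): each step moves the state deeper
    # into the basin of the nearest stored attractor.
    x = probe.copy()
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1.0
    return x

n = 64
patterns = rng.choice([-1.0, 1.0], size=(3, n))
W = store(patterns)

# Corrupt a stored pattern with noise, then clean it up iteratively.
noisy = patterns[0].copy()
flip = rng.choice(n, size=8, replace=False)
noisy[flip] *= -1.0
recovered = retrieve(W, noisy)
print(np.mean(recovered == patterns[0]))  # fraction of bits recovered
```

With a low memory load (3 patterns in 64 dimensions), the corrupted probe typically converges back to the stored pattern. The paper's contribution is to obtain this cleanup behaviour in a trained deep generative model, where the variational lower bound serves as the Lyapunov function that the iterative dynamics decrease.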
