
Learning Attractor Dynamics for Generative Memory
Yan Wu · Gregory Wayne · Karol Gregor · Timothy Lillicrap

Wed Dec 05 07:45 AM -- 09:45 AM (PST) @ Room 210 #33

A central challenge faced by memory systems is the robust retrieval of a stored pattern in the presence of interference due to other stored patterns and noise. A theoretically well-founded solution to robust retrieval is given by attractor dynamics, which iteratively clean up patterns during recall. However, incorporating attractor dynamics into modern deep learning systems poses difficulties: attractor basins are characterised by vanishing gradients, which are known to make training neural networks difficult. In this work, we exploit recent advances in variational inference and avoid the vanishing gradient problem by training a generative distributed memory with a variational lower-bound-based Lyapunov function. The model is minimalistic, with surprisingly few parameters. Experiments show that it converges to correct patterns upon iterative retrieval and achieves competitive performance as both a memory model and a generative model.
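The core idea of attractor-based retrieval, iteratively cleaning up a noisy probe until it settles into a stored pattern, can be illustrated with a classic Hopfield network. This is a generic sketch of the attractor mechanism the abstract refers to, not the paper's variational generative memory; the function names (`store`, `retrieve`) and parameters are hypothetical.

```python
import numpy as np

def store(patterns):
    """Build a Hebbian weight matrix from rows of +/-1 patterns
    (classic Hopfield storage rule, used here purely for illustration)."""
    P = np.asarray(patterns, dtype=float)
    n = P.shape[1]
    W = P.T @ P / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def retrieve(W, probe, steps=20):
    """Iteratively clean up a noisy probe. Each synchronous update
    moves the state toward a fixed point of the dynamics (an attractor),
    which here corresponds to a minimum of the network's energy
    (Lyapunov) function."""
    s = np.sign(probe).astype(float)
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1.0  # break ties deterministically
        if np.array_equal(s_new, s):
            break  # fixed point reached: the probe has been cleaned up
        s = s_new
    return s

# Usage: store two random patterns, corrupt one, and recover it.
rng = np.random.default_rng(0)
patterns = rng.choice([-1.0, 1.0], size=(2, 64))
W = store(patterns)
noisy = patterns[0].copy()
flip = rng.choice(64, size=6, replace=False)
noisy[flip] *= -1  # flip 6 of the 64 bits
recovered = retrieve(W, noisy)
print(np.array_equal(recovered, patterns[0]))
```

With only two stored patterns, the corrupted probe lies well inside the attractor basin of the first pattern, so a few update steps restore it exactly. The paper's contribution is precisely to make this kind of clean-up trainable in a deep generative model, where the vanishing gradients inside attractor basins would otherwise block learning.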

Author Information

Yan Wu (DeepMind)
Gregory Wayne (Google DeepMind)
Karol Gregor (DeepMind)
Timothy Lillicrap (Google DeepMind)
