
Poster

Recurrent Ladder Networks

Isabeau Prémont-Schwarz · Alexander Ilin · Tele Hao · Antti Rasmus · Rinu Boney · Harri Valpola

Pacific Ballroom #111

Keywords: [ Deep Learning ] [ Recurrent Networks ]


Abstract:

We propose a recurrent extension of the Ladder network whose structure is motivated by the inference required in hierarchical latent variable models. We demonstrate that the recurrent Ladder is able to handle a wide variety of complex learning tasks that benefit from iterative inference and temporal modeling. The architecture achieves close-to-optimal results on temporal modeling of video data, competitive results on music modeling, and improved perceptual grouping based on higher-order abstractions, such as stochastic textures and motion cues. We present results for fully supervised, semi-supervised, and unsupervised tasks. The results suggest that the proposed architecture and principles are powerful tools for learning a hierarchy of abstractions, performing iterative inference, and handling temporal information.
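The core idea described above, an encoder-decoder Ladder stack unrolled in time with the top-down decoder signal fed back into the next step's bottom-up pass, can be sketched with toy scalar layers. All function names and weights below are hypothetical stand-ins; the paper's model uses learned convolutional and recurrent mappings, so this is only an illustrative sketch of the connectivity pattern:

```python
# Toy sketch of a recurrent Ladder step (illustrative only; hypothetical
# names and fixed scalar weights stand in for learned mappings).

def encoder_step(x, top_down_prev, w_enc):
    """Bottom-up pass: each layer combines its input with the decoder
    signal from the previous time step (the recurrent connection)."""
    activations = []
    h = x
    for w, d in zip(w_enc, top_down_prev):
        h = max(0.0, w * (h + d))  # toy ReLU layer with recurrent feedback
        activations.append(h)
    return activations

def decoder_step(activations, w_dec):
    """Top-down pass: each layer merges the top-down signal with the
    lateral encoder activation (the Ladder's skip connection)."""
    top_down = [0.0] * len(activations)
    d = activations[-1]
    for i in reversed(range(len(activations))):
        d = w_dec[i] * (d + activations[i])  # merge lateral + top-down
        top_down[i] = d
    return top_down

# Unroll over a short input sequence: decoder outputs at time t feed
# back into the encoder at time t+1, giving iterative inference.
w_enc, w_dec = [0.5, 0.5], [0.5, 0.5]
top_down = [0.0, 0.0]
outputs = []
for x in [1.0, 2.0, 3.0]:
    acts = encoder_step(x, top_down, w_enc)
    top_down = decoder_step(acts, w_dec)
    outputs.append(top_down[0])
```

The feedback loop (`top_down` carried across time steps) is what distinguishes this from a plain Ladder network: each new observation is interpreted in light of the previous step's top-down reconstruction, which is the iterative-inference behavior the abstract refers to.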
