Generative models can be seen as the Swiss Army knives of machine learning, as many problems can be written probabilistically in terms of the distribution of the data, including prediction, reconstruction, imputation and simulation. One of the most promising directions for unsupervised learning may lie in Deep Learning methods, given their success in supervised learning. However, one of the current problems with deep unsupervised learning methods is that they are often harder to scale. As a result, some simpler, more scalable shallow methods, such as the Gaussian Mixture Model and the Student-t Mixture Model, remain surprisingly competitive. In this paper we propose a new scalable deep generative model for images, called the Deep Gaussian Mixture Model, which is a straightforward but powerful generalization of GMMs to multiple layers. The parametrization of a Deep GMM allows it to efficiently capture products of variations in natural images. We propose a new EM-based algorithm that scales well to large datasets, and we show that both the Expectation and the Maximization steps can easily be distributed over multiple machines. In our density estimation experiments we show that deeper GMM architectures generalize better than shallower ones, with results in the same ballpark as the state of the art.
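The abstract describes a Deep GMM as a multilayer generalization of a GMM: a sample is drawn as standard normal noise and then passed through one randomly chosen linear transformation per layer, so that paths through the network compose products of variations. Below is a minimal sampling sketch of that idea; the function name `sample_deep_gmm` and the layer representation as lists of `(A, b)` pairs are illustrative assumptions, not the paper's exact parametrization or training procedure.

```python
import numpy as np

def sample_deep_gmm(layers, n, d, rng):
    """Draw n d-dimensional samples from a deep GMM (sampling sketch).

    layers: a list of layers; each layer is a list of (A, b) pairs,
    one linear transformation per mixture component in that layer.
    Each sample starts as standard normal noise and, at every layer,
    is transformed by one uniformly chosen component, so the overall
    density is a mixture over all root-to-leaf paths.
    """
    x = rng.standard_normal((n, d))
    for components in layers:
        # Independently pick a component index for each sample.
        idx = rng.integers(len(components), size=n)
        for i in range(n):
            A, b = components[idx[i]]
            x[i] = A @ x[i] + b
    return x

rng = np.random.default_rng(0)
d = 2
make = lambda: (rng.standard_normal((d, d)), rng.standard_normal(d))
# A small two-layer architecture: 3 components, then 2 components.
layers = [[make() for _ in range(3)], [make() for _ in range(2)]]
samples = sample_deep_gmm(layers, 5, d, rng)
```

With a single layer this reduces to an ordinary GMM (each component `(A, b)` yields a Gaussian with mean `b` and covariance `A Aᵀ`), which matches the abstract's claim that the Deep GMM generalizes the GMM to multiple layers.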
Author Information
Aaron van den Oord (n/a)
Benjamin Schrauwen (Ghent University)
More from the Same Authors
- 2013 Poster: Training and Analysing Deep Recurrent Neural Networks (Michiel Hermans · Benjamin Schrauwen)
- 2013 Demonstration: Deep Content-Based Music Recommendation (Aaron van den Oord · Sander Dieleman · Benjamin Schrauwen)
- 2013 Poster: Deep content-based music recommendation (Aaron van den Oord · Sander Dieleman · Benjamin Schrauwen)
- 2012 Demonstration: A Fast Accurate Training-less P300 Speller: Unsupervised Learning Uncovers new Possibilities (Pieter-Jan Kindermans · Hannes Verschore · David Verstraeten · Benjamin Schrauwen)
- 2012 Poster: A P300 BCI for the Masses: Prior Information Enables Instant Unsupervised Spelling (Pieter-Jan Kindermans · Hannes Verschore · David Verstraeten · Benjamin Schrauwen)