

Poster

Glow: Generative Flow with Invertible 1x1 Convolutions

Diederik Kingma · Prafulla Dhariwal

Room 210 #11

Keywords: [ Latent Variable Models ] [ Representation Learning ] [ Generative Models ] [ Density Estimation ] [ Nonlinear Dimensionality Reduction and Manifold Learning ] [ Unsupervised Learning ]


Abstract:

Flow-based generative models are conceptually attractive due to the tractability of the exact log-likelihood, the tractability of exact latent-variable inference, and the parallelizability of both training and synthesis. In this paper we propose Glow, a simple type of generative flow using an invertible 1x1 convolution. Using our method we demonstrate a significant improvement in log-likelihood and qualitative sample quality. Perhaps most strikingly, we demonstrate that a generative model optimized towards the plain log-likelihood objective is capable of efficient synthesis of large and subjectively realistic-looking images.
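The invertible 1x1 convolution named in the abstract can be illustrated with a minimal sketch (not the authors' code): for an input of shape (H, W, C), the layer multiplies each spatial position's C-dimensional channel vector by a learned invertible C x C matrix, so the operation is exactly invertible and its log-determinant contribution to the log-likelihood is H·W·log|det W|. The shapes and the orthogonal initialization below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
C, H, Wd = 4, 8, 8  # channels, spatial height, spatial width

# Initialize the weight as a random orthogonal matrix, so it is
# invertible and starts with log|det| = 0 (a common choice).
Q, _ = np.linalg.qr(rng.normal(size=(C, C)))
Wmat = Q

def conv1x1(x, Wmat):
    # A 1x1 convolution is a per-pixel matrix multiply over channels.
    return x @ Wmat.T

def logdet_term(Wmat, H, Wd):
    # The change-of-variables log-det, counted once per spatial position.
    sign, logabsdet = np.linalg.slogdet(Wmat)
    return H * Wd * logabsdet

x = rng.normal(size=(H, Wd, C))
y = conv1x1(x, Wmat)                       # forward pass
x_rec = conv1x1(y, np.linalg.inv(Wmat))    # exact inverse pass

assert np.allclose(x, x_rec)
```

The exact invertibility is what makes both density evaluation and synthesis tractable: the same weights run the model forward (data to latents) and backward (latents to samples).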
