

Poster

GumBolt: Extending Gumbel trick to Boltzmann priors

Amir H Khoshaman · Mohammad Amin

Room 210 #8

Keywords: [ Variational Inference ] [ Graphical Models ] [ Generative Models ] [ Hierarchical Models ] [ Latent Variable Models ] [ Efficient Training Methods ]


Abstract:

Boltzmann machines (BMs) are appealing candidates for powerful priors in variational autoencoders (VAEs), as they are capable of capturing nontrivial and multi-modal distributions over discrete variables. However, the non-differentiability of the discrete units prohibits using the reparameterization trick, which is essential for low-noise backpropagation. The Gumbel trick resolves this problem in a consistent way by relaxing the variables and distributions, but it is incompatible with BM priors. Here, we propose GumBolt, a model that extends the Gumbel trick to BM priors in VAEs. GumBolt is significantly simpler than recently proposed methods with BM priors and outperforms them by a considerable margin. It achieves state-of-the-art performance on the permutation-invariant MNIST and OMNIGLOT datasets among models with only discrete latent variables. Moreover, performance can be further improved by multi-sample (importance-weighted) estimation of the log-likelihood during training, which was not possible with previous models.
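For readers unfamiliar with the relaxation the abstract refers to, below is a minimal PyTorch sketch of the binary Gumbel-softmax (Concrete) trick for discrete latent units. The function name gumbel_sigmoid and the temperature parameter tau are illustrative choices; this shows only the generic relaxation, not the GumBolt objective itself.

```python
import torch

def gumbel_sigmoid(logits: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    """Differentiable relaxation of Bernoulli sampling (binary Concrete).

    Adds logistic noise to the logits and squashes through a
    temperature-scaled sigmoid; as tau -> 0 the samples approach
    exact {0, 1} values, recovering the discrete distribution.
    """
    u = torch.rand_like(logits).clamp(1e-6, 1 - 1e-6)
    logistic_noise = torch.log(u) - torch.log1p(-u)  # Logistic(0, 1) sample
    return torch.sigmoid((logits + logistic_noise) / tau)

# Usage: relaxed samples admit low-variance reparameterized gradients.
logits = torch.zeros(4, 16, requires_grad=True)
z = gumbel_sigmoid(logits, tau=0.5)  # values in (0, 1), differentiable
z.sum().backward()                   # gradients flow back to the logits
```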
