

Poster

A Geometric Perspective on Variational Autoencoders

Clément Chadebec · Stephanie Allassonniere

Hall J (level 1) #915

Keywords: [ latent space modeling ] [ Riemannian geometry ] [ Variational Autoencoders ]


Abstract:

This paper introduces a new interpretation of the Variational Autoencoder framework from a fully geometric point of view. We argue that vanilla VAE models naturally unveil a Riemannian structure in their latent space, and that taking these geometrical aspects into account can lead to better interpolations and an improved generation procedure. The proposed sampling method consists in sampling from the uniform distribution intrinsically derived from the learned Riemannian latent space, and we show that this scheme can make a vanilla VAE competitive with, and even better than, more advanced variants on several benchmark datasets. Since generative models are known to be sensitive to the number of training samples, we also highlight the method's robustness in the low-data regime.
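As an illustration of the kind of sampling the abstract describes, the sketch below rejection-samples latent codes with density proportional to the Riemannian volume element sqrt(det G(z)) over a bounded latent box. It is a minimal, hypothetical example: the metric `metric_tensor` is a toy placeholder, not the metric learned by the authors' VAE, and the function names and bounds are assumptions for illustration only.

```python
import numpy as np

def metric_tensor(z):
    """Toy latent metric G(z): identity plus a rank-1 term (placeholder only)."""
    d = z.shape[-1]
    return np.eye(d) + np.outer(z, z)

def riemannian_volume(z):
    """Volume element sqrt(det G(z)), which weights the intrinsic uniform law."""
    return np.sqrt(np.linalg.det(metric_tensor(z)))

def sample_riemannian_uniform(n_samples, bound=3.0, dim=2, seed=0):
    """Rejection-sample z in [-bound, bound]^dim with density proportional to sqrt(det G(z))."""
    rng = np.random.default_rng(seed)
    # Crude upper bound on the volume element over the box, used for the acceptance test.
    probe = rng.uniform(-bound, bound, size=(10_000, dim))
    m = max(riemannian_volume(z) for z in probe) * 1.1

    samples = []
    while len(samples) < n_samples:
        z = rng.uniform(-bound, bound, size=dim)
        if rng.uniform() * m <= riemannian_volume(z):
            samples.append(z)
    return np.stack(samples)

# Usage: draw latent samples and pass them to a trained VAE decoder.
latent_samples = sample_riemannian_uniform(64)
```

Regions where the metric's volume element is large are sampled more often, which is what makes the distribution "uniform" with respect to the learned geometry rather than the flat Euclidean latent coordinates.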
