

Poster

Rényi Divergence Variational Inference

Yingzhen Li · Richard Turner

Area 5+6+7+8 #112

Keywords: [ Variational Inference ] [ Deep Learning or Neural Networks ]


Abstract:

This paper introduces the variational Rényi bound (VR) that extends traditional variational inference to Rényi's α-divergences. This new family of variational methods unifies a number of existing approaches, and enables a smooth interpolation from the evidence lower-bound to the log (marginal) likelihood that is controlled by the value of α that parametrises the divergence. The reparameterization trick, Monte Carlo approximation and stochastic optimisation methods are deployed to obtain a tractable and unified framework for optimisation. We further consider negative α values and propose a novel variational inference method as a new special case in the proposed framework. Experiments on Bayesian neural networks and variational auto-encoders demonstrate the wide applicability of the VR bound.
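For concreteness, the VR bound described above takes the form L_α(q; x) = (1/(1−α)) · log E_{q(z)}[(p(x, z)/q(z))^(1−α)], which recovers the standard evidence lower bound in the limit α → 1. The sketch below is a minimal NumPy illustration of the Monte Carlo estimator the abstract alludes to; the names (vr_bound_estimate, log_joint, sample_q, log_q) are illustrative assumptions rather than the authors' reference implementation, and the samples returned by sample_q are assumed to be reparameterised so that gradients can flow through them.

```python
import numpy as np

def vr_bound_estimate(log_joint, sample_q, log_q, alpha, K=100):
    """Monte Carlo estimate of the variational Renyi (VR) bound (illustrative sketch).

    log_joint(z): log p(x, z) for a fixed observation x, evaluated at samples z (shape (K,)).
    sample_q(K):  draws K reparameterised samples z_1..z_K from q(z), shape (K, dim).
    log_q(z):     log density of q at the samples, shape (K,).
    alpha:        Renyi divergence parameter; alpha -> 1 recovers the ELBO.
    """
    z = sample_q(K)
    log_w = log_joint(z) - log_q(z)          # importance log-weights log[p(x, z_k) / q(z_k)]
    if np.isclose(alpha, 1.0):
        # alpha -> 1 limit: the standard evidence lower bound, E_q[log w]
        return np.mean(log_w)
    # (1/(1 - alpha)) * log( (1/K) * sum_k w_k^(1 - alpha) ), computed stably in log space
    scaled = (1.0 - alpha) * log_w
    log_mean = np.logaddexp.reduce(scaled) - np.log(K)
    return log_mean / (1.0 - alpha)
```

Smaller (including negative) values of α push the estimate towards the log marginal likelihood, while α = 1 gives back standard variational inference; this is the interpolation the abstract refers to.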
