Given an unnormalized target distribution, we want to obtain approximate samples from it and a tight lower bound on its log normalization constant log Z. Annealed Importance Sampling (AIS) with Hamiltonian MCMC is a powerful method that can be used to do this. Its main drawback is that it uses non-differentiable transition kernels, which makes tuning its many parameters hard. We propose a framework that uses an AIS-like procedure with uncorrected Hamiltonian MCMC, called Uncorrected Hamiltonian Annealing. Our method leads to tight and differentiable lower bounds on log Z. We show empirically that our method yields better performance than other competing approaches, and that the ability to tune its parameters using reparameterization gradients may lead to large performance improvements.
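The procedure the abstract describes can be illustrated with a minimal sketch: annealing from a tractable starting distribution to an unnormalized target, where each transition is an uncorrected leapfrog move (no Metropolis accept/reject step), and the averaged log importance weight gives a stochastic lower bound on log Z. This is not the paper's full UHA estimator — UHA's weights also account for the momentum variables and the method tunes its parameters by gradient descent — and the 1-D Gaussian target, step size, and schedule below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_gamma(x):
    # Unnormalized target: N(2, 1) without its constant, so log Z = 0.5*log(2*pi)
    return -0.5 * (x - 2.0) ** 2

def log_p0(x):
    # Normalized initial distribution: standard normal N(0, 1)
    return -0.5 * x ** 2 - 0.5 * np.log(2 * np.pi)

def log_pi(x, beta):
    # Geometric annealing path between p0 (beta=0) and gamma (beta=1)
    return (1 - beta) * log_p0(x) + beta * log_gamma(x)

def grad_log_pi(x, beta):
    # Gradient of the annealed log density (analytic for this Gaussian example)
    return (1 - beta) * (-x) + beta * (-(x - 2.0))

def leapfrog(x, v, beta, eps=0.1, steps=5):
    # Uncorrected Hamiltonian transition: leapfrog integration with
    # NO Metropolis accept/reject, so the map is differentiable in x and eps
    v = v + 0.5 * eps * grad_log_pi(x, beta)
    for _ in range(steps - 1):
        x = x + eps * v
        v = v + eps * grad_log_pi(x, beta)
    x = x + eps * v
    v = v + 0.5 * eps * grad_log_pi(x, beta)
    return x, v

def ais_lower_bound(n_samples=2000, n_temps=50):
    betas = np.linspace(0.0, 1.0, n_temps + 1)
    x = rng.standard_normal(n_samples)      # draw from p0
    logw = np.zeros(n_samples)
    for t in range(1, n_temps + 1):
        # Accumulate the AIS log-weight increment at the current state
        logw += log_pi(x, betas[t]) - log_pi(x, betas[t - 1])
        # Resample momentum, then move with an uncorrected leapfrog step
        v = rng.standard_normal(n_samples)
        x, v = leapfrog(x, v, betas[t])
    # By Jensen's inequality, E[logw] <= log Z: a stochastic lower bound
    return logw.mean()

est = ais_lower_bound()
true_log_z = 0.5 * np.log(2 * np.pi)  # exact answer for this toy target
```

Because every operation above is a smooth function of the step size and the annealing schedule, the bound could in principle be tightened by reparameterization gradients, which is the tuning ability the abstract highlights; a corrected (Metropolis-adjusted) kernel would break that differentiability.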
Author Information
Tomas Geffner (UMass Amherst)
Justin Domke (University of Massachusetts, Amherst)
More from the Same Authors
- 2021 Poster: Amortized Variational Inference for Simple Hierarchical Models »
  Abhinav Agrawal · Justin Domke
- 2020 Poster: Advances in Black-Box VI: Normalizing Flows, Importance Weighting, and Optimization »
  Abhinav Agrawal · Daniel Sheldon · Justin Domke
- 2020 Poster: Approximation Based Variance Reduction for Reparameterization Gradients »
  Tomas Geffner · Justin Domke
- 2019 Poster: Thompson Sampling and Approximate Inference »
  My Phan · Yasin Abbasi Yadkori · Justin Domke
- 2019 Poster: Provable Gradient Variance Guarantees for Black-Box Variational Inference »
  Justin Domke
- 2019 Poster: Divide and Couple: Using Monte Carlo Variational Objectives for Posterior Approximation »
  Justin Domke · Daniel Sheldon
- 2019 Spotlight: Divide and Couple: Using Monte Carlo Variational Objectives for Posterior Approximation »
  Justin Domke · Daniel Sheldon
- 2018 Poster: Using Large Ensembles of Control Variates for Variational Inference »
  Tomas Geffner · Justin Domke
- 2018 Poster: Importance Weighting and Variational Inference »
  Justin Domke · Daniel Sheldon