We examine the accuracy of black-box variational posterior approximations for parametric models in a probabilistic programming context. The performance of these approximations depends on (1) how well the variational family approximates the true posterior distribution, (2) the choice of divergence, and (3) the optimization of the variational objective. We show that even when the variational family contains the true posterior, high-dimensional posteriors can be very poorly approximated by common stochastic gradient descent (SGD) optimizers. Motivated by recent theory, we propose a simple, parallelizable way to improve SGD estimates for variational inference. The approach is theoretically motivated and comes with a convergence diagnostic and a novel stopping rule, both of which are robust to noisy evaluations of the objective function. We show empirically that the new workflow works well on a diverse set of models and datasets, and warns the user if the stochastic optimization fails or if the chosen variational distribution is a poor fit.
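The abstract describes the workflow only at a high level. As a hedged illustration of the general recipe, the sketch below runs several SGD chains in parallel on a toy variational objective, applies a split-R-hat-style diagnostic to the iterate traces to judge whether optimization has settled, and reports an average of the tail iterates rather than the last iterate. Everything here (the 1-D Gaussian target, step size, warm-up fraction, and the 1.1 cutoff) is an illustrative assumption, not the paper's exact algorithm.

```python
# Minimal sketch (not the paper's algorithm): parallel SGD chains for
# variational inference, a split-R-hat convergence check, and iterate
# averaging of the tails. All constants below are toy assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy target: the posterior is N(mu_true, exp(log_sigma_true)^2), so the
# mean-field Gaussian family contains the exact posterior (the best-case
# setting the abstract describes).
mu_true, log_sigma_true = 2.0, 0.5

def grad_neg_elbo(params, n_mc=1):
    """Reparameterization-trick gradient estimate of the negative ELBO for
    q(z) = N(mu, exp(log_sigma)^2) against the toy Gaussian target."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(n_mc)
    z = mu + sigma * eps                                  # samples from q
    dlogp_dz = -(z - mu_true) / np.exp(2 * log_sigma_true)
    g_mu = -np.mean(dlogp_dz)
    g_log_sigma = -np.mean(dlogp_dz * eps * sigma) - 1.0  # -1 from q's entropy
    return np.array([g_mu, g_log_sigma])

def run_chain(init, n_iter, lr):
    """Plain SGD; returns the full iterate trace so it can be diagnosed."""
    params, trace = init.copy(), np.empty((n_iter, 2))
    for t in range(n_iter):
        params = params - lr * grad_neg_elbo(params)
        trace[t] = params
    return trace

def split_rhat(tails):
    """Split-R-hat of iterate traces; tails: (n_chains, n_keep, dim).
    Values near 1 suggest the chains reached the same stationary phase."""
    half = tails.shape[1] // 2
    x = np.concatenate([tails[:, :half], tails[:, half:2 * half]], axis=0)
    n = x.shape[1]
    w = x.var(axis=1, ddof=1).mean(axis=0)                # within-chain var
    b = n * x.mean(axis=1).var(axis=0, ddof=1)            # between-chain var
    return np.sqrt(((n - 1) / n * w + b / n) / w)

n_chains, n_iter, lr = 4, 2000, 0.05
inits = rng.normal(0.0, 2.0, size=(n_chains, 2))
traces = np.stack([run_chain(inits[k], n_iter, lr) for k in range(n_chains)])

tails = traces[:, n_iter // 2:, :]                        # discard warm-up half
rhat = split_rhat(tails)
if np.all(rhat < 1.1):                                    # common heuristic cutoff
    # Polyak-Ruppert-style averaging of the tail iterates damps SGD noise.
    mu_hat, log_sigma_hat = tails.mean(axis=(0, 1))
    print(f"R-hat {rhat}; averaged estimate: mu={mu_hat:.3f}, "
          f"log_sigma={log_sigma_hat:.3f} (target: 2.0, 0.5)")
else:
    print(f"R-hat {rhat}: chains disagree; treat the fit as unconverged.")
```

The last-iterate estimate of a single SGD run wanders with the gradient noise; averaging tail iterates across chains trades that noise for a small bias, and the R-hat check plays the role of the convergence diagnostic the abstract mentions: if the chains disagree, the workflow warns instead of returning an estimate.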
Author Information
Akash Kumar Dhaka (Aalto University)
Alejandro Catalina (Aalto University)
Michael Andersen (Technical University of Denmark)
Måns Magnusson (Aalto University)
Jonathan Huggins (Boston University)
Aki Vehtari (Aalto University)
More from the Same Authors
- 2021: Make cross-validation Bayes again (Yuling Yao · Aki Vehtari)
- 2022: SolarDK: A high-resolution urban solar panel image classification and localization dataset (Maxim Khomiakov · Julius Holbech Radzikowski · Carl Schmidt · Mathias Bonde Sørensen · Mads Andersen · Michael Andersen · Jes Frellsen)
- 2021: Invited Talk 4 Q&A (Jonathan Huggins)
- 2021: Statistically Robust Inference with Stochastic Gradient Algorithms (Jonathan Huggins)
- 2021 Poster: Challenges and Opportunities in High Dimensional Variational Inference (Akash Kumar Dhaka · Alejandro Catalina · Manushi Welandawe · Michael Andersen · Jonathan Huggins · Aki Vehtari)
- 2020 Poster: Hamiltonian Monte Carlo using an adjoint-differentiated Laplace approximation: Bayesian inference for latent Gaussian models and beyond (Charles Margossian · Aki Vehtari · Daniel Simpson · Raj Agrawal)
- 2018: Software Panel (Ben Letham · David Duvenaud · Dustin Tran · Aki Vehtari)
- 2018 Poster: Random Feature Stein Discrepancies (Jonathan Huggins · Lester Mackey)
- 2017: Poster Spotlights (Francesco Locatello · Ari Pakman · Da Tang · Thomas Rainforth · Zalan Borsos · Marko Järvenpää · Eric Nalisnick · Gabriele Abbati · XIAOYU LU · Jonathan Huggins · Rachit Singh · Rui Luo)
- 2017 Poster: PASS-GLM: polynomial approximate sufficient statistics for scalable Bayesian GLM inference (Jonathan Huggins · Ryan Adams · Tamara Broderick)
- 2017 Spotlight: PASS-GLM: polynomial approximate sufficient statistics for scalable Bayesian GLM inference (Jonathan Huggins · Ryan Adams · Tamara Broderick)
- 2016 Poster: Coresets for Scalable Bayesian Logistic Regression (Jonathan Huggins · Trevor Campbell · Tamara Broderick)
- 2009 Poster: Gaussian process regression with Student-t likelihood (Jarno Vanhatalo · Pasi Jylänki · Aki Vehtari)