Variational inference is increasingly being addressed with stochastic optimization. In this setting, the variance of the gradient estimates plays a crucial role, since high-variance gradients lead to poor convergence. A popular approach to reducing this variance is the use of control variates. Despite the good results obtained, control variates developed for variational inference are typically considered in isolation. In this paper we clarify the large number of control variates that are available by giving a systematic view of how they are derived. We also present a Bayesian risk minimization framework in which the quality of a procedure for combining control variates is quantified by its effect on optimization convergence rates; this leads to a very simple combination rule. Results show that combining a large number of control variates in this way significantly improves the convergence of inference over using typical gradient estimators or a reduced number of control variates.
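The variance-reduction idea the abstract builds on is generic Monte Carlo: subtract a quantity that is correlated with the estimator and has a known expectation, scaled by a coefficient chosen to minimize variance. The sketch below is a minimal single-control-variate illustration in plain NumPy, applied to a toy expectation; the target, the control variate, and the function names are illustrative assumptions, not the paper's combination rule.

```python
import numpy as np

def cv_estimate(f, h, h_mean, samples):
    """Estimate E[f(X)] using a control variate h with known mean E[h(X)].

    Uses the variance-minimizing coefficient c = Cov(f, h) / Var(h),
    estimated from the same samples (adequate for illustration)."""
    fx, hx = f(samples), h(samples)
    c = np.cov(fx, hx)[0, 1] / np.var(hx)
    return np.mean(fx - c * (hx - h_mean)), c

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)

# Toy target: E[exp(X)] for X ~ N(0, 1), whose true value is exp(1/2).
# Control variate: h(X) = X, with known mean 0, correlated with exp(X).
est, c = cv_estimate(np.exp, lambda z: z, 0.0, x)

var_naive = np.var(np.exp(x))       # per-sample variance, plain estimator
var_cv = np.var(np.exp(x) - c * x)  # per-sample variance with control variate
print(est, var_cv / var_naive)      # estimate near exp(0.5); ratio well below 1
```

With several control variates, the scalar `c` becomes a vector of coefficients, and the question of how to choose it is exactly the combination problem the paper's Bayesian risk minimization framework addresses.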
Author Information
Tomas Geffner (University of Massachusetts, Amherst)
Justin Domke (University of Massachusetts, Amherst)
More from the Same Authors
- 2022 : Score Modeling for Simulation-based Inference »
  Tomas Geffner · George Papamakarios · Andriy Mnih
- 2022 : Deep End-to-end Causal Inference »
  Tomas Geffner · Javier Antorán · Adam Foster · Wenbo Gong · Chao Ma · Emre Kiciman · Amit Sharma · Angus Lamb · Martin Kukla · Nick Pawlowski · Miltiadis Allamanis · Cheng Zhang
- 2023 Poster: Provable convergence guarantees for black-box variational inference »
  Justin Domke · Robert Gower · Guillaume Garrigos
- 2023 Poster: Discriminative Calibration »
  Yuling Yao · Justin Domke
- 2021 Poster: MCMC Variational Inference via Uncorrected Hamiltonian Annealing »
  Tomas Geffner · Justin Domke
- 2021 Poster: Amortized Variational Inference for Simple Hierarchical Models »
  Abhinav Agrawal · Justin Domke
- 2020 Poster: Advances in Black-Box VI: Normalizing Flows, Importance Weighting, and Optimization »
  Abhinav Agrawal · Daniel Sheldon · Justin Domke
- 2020 Poster: Approximation Based Variance Reduction for Reparameterization Gradients »
  Tomas Geffner · Justin Domke
- 2019 Poster: Thompson Sampling and Approximate Inference »
  My Phan · Yasin Abbasi Yadkori · Justin Domke
- 2019 Poster: Provable Gradient Variance Guarantees for Black-Box Variational Inference »
  Justin Domke
- 2019 Poster: Divide and Couple: Using Monte Carlo Variational Objectives for Posterior Approximation »
  Justin Domke · Daniel Sheldon
- 2019 Spotlight: Divide and Couple: Using Monte Carlo Variational Objectives for Posterior Approximation »
  Justin Domke · Daniel Sheldon
- 2018 Poster: Importance Weighting and Variational Inference »
  Justin Domke · Daniel Sheldon