Many probabilistic models of interest in scientific computing and machine learning have expensive, black-box likelihoods that prevent the application of standard techniques for Bayesian inference, such as MCMC, which would require access to the gradient or a large number of likelihood evaluations. We introduce here a novel sample-efficient inference framework, Variational Bayesian Monte Carlo (VBMC). VBMC combines variational inference with Gaussian-process based, active-sampling Bayesian quadrature, using the latter to efficiently approximate the intractable integral in the variational objective. Our method produces both a nonparametric approximation of the posterior distribution and an approximate lower bound of the model evidence, useful for model selection. We demonstrate VBMC both on several synthetic likelihoods and on a neuronal model with data from real neurons. Across all tested problems and dimensions (up to D = 10), VBMC performs consistently well in reconstructing the posterior and the model evidence with a limited budget of likelihood evaluations, unlike other methods that work only in very low dimensions. Our framework shows great promise as a novel tool for posterior and model inference with expensive, black-box likelihoods.
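The abstract's central idea — fitting a Gaussian process to a handful of expensive function evaluations and integrating the GP surrogate in closed form (Bayesian quadrature) instead of sampling — can be sketched in one dimension. This is a minimal illustration under assumed fixed hyperparameters, not VBMC itself, which adds active sampling, a variational posterior, and multivariate integrands; the function and parameter names below are hypothetical.

```python
import numpy as np

def bq_integral(x, f, ell, sf, prior_var, jitter=1e-10):
    """Bayesian quadrature estimate of integral f(t) N(t; 0, prior_var) dt.

    Fits a GP with RBF kernel k(a, b) = sf^2 * exp(-(a-b)^2 / (2 ell^2))
    to the observed values f at points x, then integrates the GP posterior
    mean against the Gaussian weight in closed form (no extra evaluations).
    """
    D = x[:, None] - x[None, :]
    K = sf**2 * np.exp(-0.5 * D**2 / ell**2) + jitter * np.eye(len(x))
    # Kernel mean z_i = integral of k(t, x_i) N(t; 0, prior_var) dt,
    # available analytically for the RBF kernel with a Gaussian weight.
    z = sf**2 * np.sqrt(ell**2 / (ell**2 + prior_var)) \
        * np.exp(-0.5 * x**2 / (ell**2 + prior_var))
    return z @ np.linalg.solve(K, f)

# Example: integral of exp(-t^2) N(t; 0, 1) dt = 1/sqrt(3) ~= 0.5774,
# recovered here from only 15 "expensive" evaluations.
x = np.linspace(-3.0, 3.0, 15)
est = bq_integral(x, np.exp(-x**2), ell=0.6, sf=1.0, prior_var=1.0)
```

The same mechanism is what lets a method in this family approximate the expectation inside the variational objective with a small budget of likelihood evaluations: each new evaluation updates the surrogate, and the integral of the surrogate is cheap.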
Luigi Acerbi (University of Geneva)
Assistant professor Luigi Acerbi leads the *Machine and Human Intelligence* group at the Department of Computer Science of the University of Helsinki. His research spans Bayesian machine learning and computational and cognitive neuroscience. He is a member of the *Finnish Centre for Artificial Intelligence* (FCAI) and of ELLIS (*European Laboratory for Learning and Intelligent Systems*).
More from the Same Authors
2023 Poster: Practical Equivariances via Relational Conditional Neural Processes »
Daolang Huang · Manuel Haussmann · Ulpu Remes · ST John · Grégoire Clarté · Kevin Sebastian Luck · Samuel Kaski · Luigi Acerbi
2023 Poster: Learning Robust Statistics for Simulation-based Inference under Model Misspecification »
Daolang Huang · Ayush Bharti · Amauri Souza · Luigi Acerbi · Samuel Kaski
2020 Poster: Variational Bayesian Monte Carlo with Noisy Likelihoods »
2020 Poster: Dynamic allocation of limited memory resources in reinforcement learning »
Nisheet Patel · Luigi Acerbi · Alexandre Pouget
2017 Poster: Practical Bayesian Optimization for Model Fitting with Bayesian Adaptive Direct Search »
Luigi Acerbi · Wei Ji Ma
2014 Poster: A Framework for Testing Identifiability of Bayesian Models of Perception »
Luigi Acerbi · Wei Ji Ma · Sethu Vijayakumar