Poster
Variational Bayesian Monte Carlo with Noisy Likelihoods
Luigi Acerbi

Thu Dec 10 09:00 AM -- 11:00 AM (PST) @ Poster Session 5 #1469

Variational Bayesian Monte Carlo (VBMC) is a recently introduced framework that uses Gaussian process surrogates to perform approximate Bayesian inference in models with black-box, non-cheap likelihoods. In this work, we extend VBMC to deal with noisy log-likelihood evaluations, such as those arising from simulation-based models. We introduce new 'global' acquisition functions, such as expected information gain (EIG) and variational interquantile range (VIQR), which are robust to noise and can be efficiently evaluated within the VBMC setting. In a novel, challenging, noisy-inference benchmark comprising a variety of models with real datasets from computational and cognitive neuroscience, VBMC+VIQR achieves state-of-the-art performance in recovering the ground-truth posteriors and model evidence. In particular, our method vastly outperforms 'local' acquisition functions and other surrogate-based inference methods while keeping a small algorithmic cost. Our benchmark corroborates VBMC as a general-purpose technique for sample-efficient black-box Bayesian inference, even with noisy models.
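To make the surrogate-based setting concrete, the following is a minimal sketch, not the actual VBMC algorithm: a toy Gaussian process surrogate of a noisy log-likelihood, with the next evaluation point chosen by a simplified uncertainty-based acquisition rule (a stand-in for the EIG/VIQR acquisition functions described in the abstract). All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise_var=0.1):
    """GP posterior mean and variance given noisy observations."""
    K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v ** 2, axis=0)
    return mean, var

rng = np.random.default_rng(0)

# Toy "black-box" target: log-likelihood of a standard normal,
# observed only through noisy evaluations (as in simulation-based models).
def noisy_loglik(x):
    return -0.5 * x ** 2 + 0.1 * rng.standard_normal(np.shape(x))

x_obs = np.array([-2.0, 0.5, 1.5])
y_obs = noisy_loglik(x_obs)

# Acquisition step: evaluate next where the surrogate is most uncertain.
# (VBMC's EIG/VIQR acquisitions are more sophisticated; this is a sketch.)
grid = np.linspace(-3.0, 3.0, 61)
mean, var = gp_posterior(x_obs, y_obs, grid)
x_next = grid[np.argmax(var)]
```

In the full algorithm this evaluate-then-refit loop repeats, with the surrogate also driving a variational approximation of the posterior; the sketch above only illustrates the surrogate and acquisition ingredients.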

Author Information

Luigi Acerbi (University of Helsinki)

Assistant professor Luigi Acerbi leads the *Machine and Human Intelligence* group at the Department of Computer Science of the University of Helsinki. His research spans Bayesian machine learning and computational and cognitive neuroscience. He is a member of the *Finnish Centre for Artificial Intelligence* (FCAI), the *International Brain Laboratory*, and ELLIS (*European Laboratory for Learning and Intelligent Systems*), and an off-site visiting scholar at New York University.
