

Poster
in
Workshop: Learning-Based Solutions for Inverse Problems

Improved Black-box Variational Inference for High-dimensional Bayesian Inversion involving Black-box Simulators

Dhruv Patel · Jonghyun Lee · Matthew Farthing · Tyler Hesser · Peter Kitanidis · Eric Darve

Keywords: [ Variational Inference ] [ High-dimensional Modeling ] [ Black-box Simulators ] [ Bayesian Inversion ] [ Deep Generative Priors ]


Abstract:

Black-box forward-model simulators are widely used in scientific and engineering domains for their ability to mimic complex physical systems. However, because such simulators do not expose gradients, current state-of-the-art gradient-based Bayesian inference techniques such as Hamiltonian Monte Carlo and variational inference become infeasible. We address this challenge with a modular approach that combines black-box variational inference (BBVI) with deep generative priors, making it possible to perform high-dimensional Bayesian inversion efficiently and accurately in these settings. Our method introduces a novel gradient correction term and a sampling strategy for BBVI, which together reduce gradient errors by several orders of magnitude across a range of dimensions, even with small batch sizes. Furthermore, integrating our method with Generative Adversarial Network (GAN)-based priors significantly improves the solution of high-dimensional inverse problems. We validate the algorithm's effectiveness on a range of physics-based inverse problems using both simulated and experimental data. Compared to Markov Chain Monte Carlo (MCMC) methods, our approach consistently delivers superior accuracy and substantial gains in both statistical and computational efficiency, often by an order of magnitude.
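The abstract does not spell out the estimator, but the standard starting point for BBVI with a non-differentiable simulator is the score-function (REINFORCE) gradient of the ELBO, which needs only log-density evaluations of the target; variance-reduction terms (here, a simple mean baseline as a control variate, a much simpler stand-in for the paper's gradient correction) are what make it usable in practice. The sketch below is illustrative only, not the authors' method: `score_grad_elbo`, the Gaussian variational family, and the toy target are all assumptions for the example.

```python
import numpy as np

def score_grad_elbo(logp, mu, log_std, n=200_000, rng=None):
    """Score-function (REINFORCE) ELBO gradient for q = N(mu, exp(log_std)^2).

    Requires only black-box evaluations of logp (no gradients of the
    forward model); a mean baseline serves as a simple control variate.
    Illustrative sketch, not the paper's corrected estimator.
    """
    rng = rng or np.random.default_rng(0)
    std = np.exp(log_std)
    eps = rng.standard_normal(n)
    z = mu + std * eps                           # samples from q
    f = logp(z)                                  # black-box log-density evals
    logq = -0.5 * eps**2 - log_std - 0.5 * np.log(2 * np.pi)
    w = f - logq                                 # per-sample ELBO integrand
    b = w.mean()                                 # baseline (reduces variance)
    dlogq_dmu = eps / std                        # score w.r.t. mu
    dlogq_dls = eps**2 - 1.0                     # score w.r.t. log_std
    return np.mean(dlogq_dmu * (w - b)), np.mean(dlogq_dls * (w - b))

# Sanity check against the analytic gradient for a Gaussian target N(1, 1):
# ELBO(mu, s) = -0.5*((mu-1)^2 + s^2) + log s + const, so at mu=0, s=1
# the exact gradients are dELBO/dmu = 1 and dELBO/dlog_std = 0.
g_mu, g_ls = score_grad_elbo(lambda z: -0.5 * (z - 1.0)**2,
                             mu=0.0, log_std=0.0)
```

With a large sample budget the estimates land close to the analytic values (≈1 and ≈0); shrinking `n` makes the variance of this vanilla estimator visible, which is the failure mode that motivates the stronger corrections described in the abstract.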
