Parametric stochastic simulators are ubiquitous in science, often featuring high-dimensional input parameters and/or an intractable likelihood. Performing Bayesian parameter inference in this context can be challenging. We present a neural simulation-based inference algorithm that simultaneously offers simulation efficiency and fast empirical posterior testability, a combination unique among modern algorithms. Our approach is simulation efficient because it estimates low-dimensional marginal posteriors instead of the joint posterior and proposes simulations targeted to an observation of interest via a prior suitably truncated by an indicator function. Furthermore, by estimating a locally amortized posterior, our algorithm enables efficient empirical tests of the robustness of the inference results. Since scientists cannot access the ground truth, such tests are necessary for trusting inference in real-world applications. We perform experiments on a marginalized version of the simulation-based inference benchmark and on two complex, narrow posteriors, highlighting the simulation efficiency of our algorithm as well as the quality of the estimated marginal posteriors.
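The truncation idea described in the abstract, restricting the prior to a region consistent with the observation via an indicator function and drawing targeted simulations from it, can be sketched as simple rejection sampling. This is only an illustrative sketch: the uniform prior, the box-shaped truncation region, and all names below are assumptions for demonstration, not the paper's actual implementation (which learns the region from ratio estimates).

```python
import numpy as np

rng = np.random.default_rng(0)

def prior_sample(n):
    # Illustrative prior: uniform on [-1, 1]^2
    return rng.uniform(-1.0, 1.0, size=(n, 2))

def indicator(theta, lo, hi):
    # 1 inside the (assumed box-shaped) truncation region, 0 outside
    return np.all((theta >= lo) & (theta <= hi), axis=1)

def truncated_prior_sample(n, lo, hi):
    # Rejection sampling: keep only prior draws where the indicator is 1,
    # so downstream simulations are targeted at the plausible region.
    accepted = []
    total = 0
    while total < n:
        batch = prior_sample(4 * n)
        keep = batch[indicator(batch, lo, hi)]
        accepted.append(keep)
        total += len(keep)
    return np.concatenate(accepted)[:n]

# Hypothetical truncation bounds, e.g. inferred from a previous inference round
lo, hi = np.array([-0.2, -0.2]), np.array([0.3, 0.3])
thetas = truncated_prior_sample(1000, lo, hi)
```

In the algorithm itself the region is tightened over rounds as the marginal posterior estimates sharpen, so each round's simulation budget is spent near the observation of interest.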
Author Information
Benjamin K Miller (University of Amsterdam)
Alex Cole (University of Amsterdam)
Patrick Forré (University of Amsterdam)
Gilles Louppe (University of Liège)
Christoph Weniger (University of Amsterdam)
More from the Same Authors
- 2021 : Automatically detecting anomalous exoplanet transits
  Christoph Hönes · Benjamin K Miller
- 2021 : Probing the Structure of String Theory Vacua with Genetic Algorithms and Reinforcement Learning
  Andreas Schachner · Sven Krippendorf · Alex Cole · Gary Shiu
- 2022 : Strong-Lensing Source Reconstruction with Denoising Diffusion Restoration Models
  Konstantin Karchev · Noemi Anau Montel · Adam Coogan · Christoph Weniger
- 2022 : Normalizing Flows for Hierarchical Bayesian Analysis: A Gravitational Wave Population Study
  David Ruhe · Kaze Wong · Miles Cranmer · Patrick Forré
- 2022 : Detection is truncation: studying source populations with truncated marginal neural ratio estimation
  Noemi Anau Montel · Christoph Weniger
- 2022 : Physics-informed inference of animal movements from weather radar data
  Fiona Lippert · Patrick Forré
- 2022 : Towards architectural optimization of equivariant neural networks over subgroups
  Kaitlin Maile · Dennis Wilson · Patrick Forré
- 2022 Poster: Contrastive Neural Ratio Estimation
  Benjamin K Miller · Christoph Weniger · Patrick Forré
- 2021 Poster: An Information-theoretic Approach to Distribution Shifts
  Marco Federici · Ryota Tomioka · Patrick Forré
- 2021 Poster: HNPE: Leveraging Global Parameters for Neural Posterior Estimation
  Pedro Rodrigues · Thomas Moreau · Gilles Louppe · Alexandre Gramfort
- 2021 Poster: Argmax Flows and Multinomial Diffusion: Learning Categorical Distributions
  Emiel Hoogeboom · Didrik Nielsen · Priyank Jaini · Patrick Forré · Max Welling
- 2021 Poster: From global to local MDI variable importances for random forests and when they are Shapley values
  Antonio Sutera · Gilles Louppe · Van Anh Huynh-Thu · Louis Wehenkel · Pierre Geurts
- 2017 : Panel session
  Iain Murray · Max Welling · Juan Carrasquilla · Anatole von Lilienfeld · Gilles Louppe · Kyle Cranmer
- 2017 : Invited talk 2: Adversarial Games for Particle Physics
  Gilles Louppe
- 2017 Poster: Learning to Pivot with Adversarial Networks
  Gilles Louppe · Michael Kagan · Kyle Cranmer
- 2013 Poster: Understanding variable importances in forests of randomized trees
  Gilles Louppe · Louis Wehenkel · Antonio Sutera · Pierre Geurts
- 2013 Spotlight: Understanding variable importances in forests of randomized trees
  Gilles Louppe · Louis Wehenkel · Antonio Sutera · Pierre Geurts