Neural Posterior Estimation methods for simulation-based inference can be ill-suited for dealing with posterior distributions obtained by conditioning on multiple observations, as they may require a large number of simulator calls to yield accurate approximations. Neural Likelihood Estimation methods can naturally handle multiple observations, but require a separate inference step, which may affect their efficiency and performance. We introduce a new method for simulation-based inference that enjoys the benefits of both approaches. We propose to model the scores of the posterior distributions induced by individual observations, and introduce a sampling algorithm that combines the learned scores to draw approximate samples from the target posterior efficiently.
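The core idea of combining per-observation posterior scores can be illustrated on a conjugate Gaussian toy model, where all scores are available in closed form. This is a minimal sketch, not the paper's method: it assumes the standard Bayes identity log p(θ | x₁..xₙ) = Σᵢ log p(θ | xᵢ) − (n−1) log p(θ) + const, uses analytic scores in place of learned score networks, and samples with plain unadjusted Langevin dynamics rather than the paper's sampler. The names (`single_posterior_score`, `combined_score`, step size `eps`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: prior theta ~ N(0, 1), likelihood x_i | theta ~ N(theta, sig2).
sig2 = 0.5
xs = np.array([1.0, 1.4, 0.6])  # three i.i.d. "observations"

def prior_score(theta):
    # grad_theta log N(theta; 0, 1)
    return -theta

def single_posterior_score(theta, x):
    # grad_theta log p(theta | x) for one observation; closed form here,
    # standing in for a learned score network in the actual method.
    post_var = 1.0 / (1.0 + 1.0 / sig2)
    post_mean = post_var * x / sig2
    return -(theta - post_mean) / post_var

def combined_score(theta, xs):
    # Score of p(theta | x_1..x_n) via the Bayes identity:
    # sum of individual posterior scores minus (n-1) prior scores.
    n = len(xs)
    return sum(single_posterior_score(theta, x) for x in xs) - (n - 1) * prior_score(theta)

# Unadjusted Langevin dynamics driven by the composed score.
theta, eps = 0.0, 1e-3
samples = []
for t in range(20000):
    theta = theta + 0.5 * eps * combined_score(theta, xs) + np.sqrt(eps) * rng.normal()
    if t > 5000:  # discard burn-in
        samples.append(theta)

# Exact posterior for comparison: precision 1 + n/sig2, mean (sum x / sig2) / precision.
prec = 1.0 + len(xs) / sig2
true_mean = xs.sum() / sig2 / prec
print(np.mean(samples), true_mean)
```

In this conjugate case one can check algebraically that the composed score equals the exact joint-posterior score (here −7θ + 6), so the Langevin chain targets the correct posterior up to discretization error; the sample mean should land near the analytic mean 6/7.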
Author Information
Tomas Geffner (University of Massachusetts, Amherst)
George Papamakarios (DeepMind)
Andriy Mnih (DeepMind)
More from the Same Authors
- 2021 : Gaussian dropout as an information bottleneck layer » Melanie Rey · Andriy Mnih
- 2022 : Deep End-to-end Causal Inference » Tomas Geffner · Javier Antorán · Adam Foster · Wenbo Gong · Chao Ma · Emre Kiciman · Amit Sharma · Angus Lamb · Martin Kukla · Nick Pawlowski · Miltiadis Allamanis · Cheng Zhang
- 2021 Poster: Coupled Gradient Estimators for Discrete Latent Variables » Zhe Dong · Andriy Mnih · George Tucker
- 2020 Poster: DisARM: An Antithetic Gradient Estimator for Binary Latent Variables » Zhe Dong · Andriy Mnih · George Tucker
- 2020 Spotlight: DisARM: An Antithetic Gradient Estimator for Binary Latent Variables » Zhe Dong · Andriy Mnih · George Tucker
- 2019 Poster: Neural Spline Flows » Conor Durkan · Artur Bekasov · Iain Murray · George Papamakarios
- 2018 Poster: Implicit Reparameterization Gradients » Mikhail Figurnov · Shakir Mohamed · Andriy Mnih
- 2018 Poster: Using Large Ensembles of Control Variates for Variational Inference » Tomas Geffner · Justin Domke
- 2018 Spotlight: Implicit Reparameterization Gradients » Mikhail Figurnov · Shakir Mohamed · Andriy Mnih
- 2017 Oral: Masked Autoregressive Flow for Density Estimation » George Papamakarios · Iain Murray · Theo Pavlakou
- 2017 Poster: Masked Autoregressive Flow for Density Estimation » George Papamakarios · Iain Murray · Theo Pavlakou
- 2017 Poster: REBAR: Low-variance, unbiased gradient estimates for discrete latent variable models » George Tucker · Andriy Mnih · Chris J Maddison · John Lawson · Jascha Sohl-Dickstein
- 2017 Oral: REBAR: Low-variance, unbiased gradient estimates for discrete latent variable models » George Tucker · Andriy Mnih · Chris J Maddison · John Lawson · Jascha Sohl-Dickstein
- 2017 Poster: Variational Memory Addressing in Generative Models » Jörg Bornschein · Andriy Mnih · Daniel Zoran · Danilo Jimenez Rezende
- 2017 Poster: Filtering Variational Objectives » Chris Maddison · John Lawson · George Tucker · Nicolas Heess · Mohammad Norouzi · Andriy Mnih · Arnaud Doucet · Yee Teh
- 2016 Poster: Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation » George Papamakarios · Iain Murray
- 2015 : Distilling Intractable Generative Models » George Papamakarios