Workshop: Optimal Transport and Machine Learning

A generative flow model for conditional sampling via optimal transport

Jason Alfonso · Ricardo Baptista · Anupam Bhakta · Noam Gal · Alfin Hou · Vasilisa Lyubimova · Daniel Pocklington · Josef Sajonz · Giulio Trigila · Ryan Tsai

Abstract: Sampling conditional distributions is a fundamental task for Bayesian inference and density estimation. Generative models characterize conditionals by learning a transport map that pushes forward a reference distribution (e.g., a standard Gaussian) to the target distribution. While these approaches can successfully describe many non-Gaussian problems, their performance is often limited by parametric bias and by the reliability of the gradient-based (adversarial) optimizers used to learn the map. This work proposes a non-parametric generative model that adaptively maps reference samples to the target. The model uses block-triangular transport maps, whose components characterize conditionals of the target distribution. These maps arise from solving an optimal transport problem with a weighted $L^2$ cost function, thereby extending the data-driven approach in [Trigila and Tabak, 2016] to conditional sampling. The proposed approach is demonstrated on a low-dimensional example and on a parameter inference problem involving nonlinear ODEs.
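To illustrate the idea of conditional sampling through a block-triangular map $T(x, z) = (x, S(x, z))$, the following sketch uses a toy case where the map is known in closed form: for a bivariate Gaussian target with correlation `rho`, the triangular (Knothe-Rosenblatt) map is $S(x, z) = \rho x + \sqrt{1-\rho^2}\, z$. This is not the paper's algorithm (which learns the map non-parametrically from data via a weighted $L^2$ optimal transport problem); it only shows how fixing the first argument and pushing reference samples through the second component yields samples from the conditional $Y \mid X = x^*$.

```python
import numpy as np

rho = 0.8  # correlation of the toy bivariate Gaussian target (assumed for illustration)

def conditional_sample(x_star, n, rng):
    """Sample Y | X = x_star by pushing reference samples z ~ N(0, 1)
    through the second component of the block-triangular map."""
    z = rng.standard_normal(n)                        # reference samples
    return rho * x_star + np.sqrt(1.0 - rho**2) * z   # S(x_star, z)

rng = np.random.default_rng(0)
samples = conditional_sample(1.5, 100_000, rng)
# Theory: E[Y | X=1.5] = rho * 1.5 = 1.2, Var[Y | X=1.5] = 1 - rho^2 = 0.36
print(samples.mean(), samples.var())
```

In the paper's setting, the closed-form `S` is replaced by a map learned adaptively from samples of the joint distribution, but the sampling mechanism (fix the conditioning variable, push the reference through) is the same.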