We consider the fundamental problem of sampling the optimal transport coupling between given source and target distributions. In certain cases, the optimal transport plan takes the form of a one-to-one mapping from the source support to the target support, but learning or even approximating such a map is computationally challenging for large, high-dimensional datasets due to the high cost of linear programming routines and an intrinsic curse of dimensionality. We instead study the Sinkhorn problem, an entropy-regularized form of optimal transport whose solutions are couplings between the source and target distributions. We introduce a novel framework for learning the Sinkhorn coupling between two distributions in the form of a score-based generative model. Conditioned on source data, our procedure iterates Langevin dynamics to sample target data according to the regularized optimal coupling. Key to this approach is a neural-network parametrization of the Sinkhorn problem, and we prove convergence of gradient descent with respect to the network parameters in this formulation. We demonstrate its empirical success on a variety of large-scale optimal transport tasks.
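The paper's method is a neural, score-based generalization; as background, the classical discrete Sinkhorn iteration that defines the entropy-regularized coupling can be sketched as follows. This is a minimal NumPy illustration of the standard algorithm, not the paper's implementation.

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.1, n_iters=200):
    """Entropy-regularized OT coupling between discrete distributions
    mu and nu with cost matrix C, via Sinkhorn matrix scaling."""
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(n_iters):
        v = nu / (K.T @ u)                # scale to match target marginal
        u = mu / (K @ v)                  # scale to match source marginal
    return u[:, None] * K * v[None, :]    # coupling P = diag(u) K diag(v)

# Toy example: two 3-point distributions on a line.
x = np.array([0.0, 1.0, 2.0])
y = np.array([0.0, 1.0, 2.0])
C = (x[:, None] - y[None, :]) ** 2        # squared-distance cost
mu = np.full(3, 1 / 3)
nu = np.full(3, 1 / 3)
P = sinkhorn(mu, nu, C)
print(P.sum())                            # a valid coupling sums to 1
```

The regularization strength `eps` controls how diffuse the coupling is: as `eps` shrinks, `P` approaches the unregularized optimal transport plan, which motivates the regularized formulation studied in the paper.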
Author Information
Grady Daniels (Northeastern University)
Tyler Maunu (Brandeis University)
Paul Hand (Northeastern University)