We introduce Unbalanced Sobolev Descent (USD), a particle descent algorithm for transporting a high dimensional source distribution to a target distribution that does not necessarily have the same mass. We define the Sobolev-Fisher discrepancy between distributions and show that it relates to advection-reaction transport equations and the Wasserstein-Fisher-Rao metric between distributions. USD transports particles along gradient flows of the witness function of the Sobolev-Fisher discrepancy (advection step) and reweighs the mass of particles with respect to this witness function (reaction step). The reaction step can be thought of as a birth-death process of the particles with rate of growth proportional to the witness function. When the Sobolev-Fisher witness function is estimated in a Reproducing Kernel Hilbert Space (RKHS), under mild assumptions we show that USD converges asymptotically (in the limit of infinite particles) to the target distribution in the Maximum Mean Discrepancy (MMD) sense. We then give two methods to estimate the Sobolev-Fisher witness with neural networks, resulting in two Neural USD algorithms. The first one implements the reaction step with mirror descent on the weights, while the second implements it through a birth-death process of particles. We show on synthetic examples that USD transports distributions with or without conservation of mass faster than previous particle descent algorithms, and finally demonstrate its use for molecular biology analyses where our method is naturally suited to match developmental stages of populations of differentiating cells based on their single-cell RNA sequencing profile. Code is available at http://github.com/ibm/usd.
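The abstract's two steps can be illustrated with a minimal NumPy sketch. This is not the paper's estimator: it replaces the Sobolev-Fisher witness with a simple Gaussian-kernel-embedding witness (target embedding minus weighted source embedding), and the bandwidth, step sizes, and mirror-descent rate below are illustrative choices. The advection step moves particles along the witness gradient; the reaction step reweights them multiplicatively by the witness value (normalized here, i.e., the mass-conserving variant; the unbalanced variant would skip the normalization).

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0  # kernel bandwidth (illustrative choice)

def k(a, b):
    # Gaussian kernel matrix between point sets a (n, d) and b (m, d)
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def grad_k(a, b):
    # gradient of k(a_i, b_j) w.r.t. a_i, shape (n, m, d)
    diff = a[:, None, :] - b[None, :, :]
    return -diff / sigma ** 2 * k(a, b)[:, :, None]

def usd_step(x, w, y, eta=0.5, lam=0.5):
    # Kernel-embedding surrogate for the witness: target minus weighted source.
    f = k(x, y).mean(1) - k(x, x) @ w
    gf = grad_k(x, y).mean(1) - np.einsum('ijd,j->id', grad_k(x, x), w)
    x = x + eta * gf               # advection: follow the witness gradient
    w = w * np.exp(lam * f)        # reaction: mirror-descent reweighting
    return x, w / w.sum()          # normalize when conserving total mass

# toy 1-D example: transport particles from N(-2, 0.5) toward N(+2, 0.5)
x = rng.normal(-2.0, 0.5, size=(100, 1))
y = rng.normal(2.0, 0.5, size=(100, 1))
w = np.full(100, 1 / 100)
for _ in range(200):
    x, w = usd_step(x, w, y)
print(float((w[:, None] * x).sum()))  # weighted particle mean, initially ~ -2, drifts toward +2
```

The birth-death variant mentioned in the abstract would instead duplicate or kill particles at rates proportional to the witness value, rather than carrying explicit weights.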
Author Information
Youssef Mroueh (IBM T.J. Watson Research Center)
Mattia Rigotti (IBM Research AI)
More from the Same Authors
- 2021 Spotlight: Measuring Generalization with Optimal Transport » Ching-Yao Chuang · Youssef Mroueh · Kristjan Greenewald · Antonio Torralba · Stefanie Jegelka
- 2021: Optimizing Functionals on the Space of Probabilities with Input Convex Neural Network » David Alvarez-Melis · Yair Schiff · Youssef Mroueh
- 2022 Poster: Compositional generalization through abstract representations in human and artificial neural networks » Takuya Ito · Tim Klinger · Doug Schultz · John Murray · Michael Cole · Mattia Rigotti
- 2021 Poster: Measuring Generalization with Optimal Transport » Ching-Yao Chuang · Youssef Mroueh · Kristjan Greenewald · Antonio Torralba · Stefanie Jegelka
- 2021 Poster: Separation Results between Fixed-Kernel and Feature-Learning Probability Metrics » Carles Domingo i Enrich · Youssef Mroueh
- 2021 Oral: Separation Results between Fixed-Kernel and Feature-Learning Probability Metrics » Carles Domingo i Enrich · Youssef Mroueh
- 2020 Poster: A Decentralized Parallel Algorithm for Training Generative Adversarial Nets » Mingrui Liu · Wei Zhang · Youssef Mroueh · Xiaodong Cui · Jarret Ross · Tianbao Yang · Payel Das
- 2019 Poster: Sobolev Independence Criterion » Youssef Mroueh · Tom Sercu · Mattia Rigotti · Inkit Padhi · Cicero Nogueira dos Santos
- 2017 Poster: Fisher GAN » Youssef Mroueh · Tom Sercu