Wasserstein gradient flows provide a powerful means of understanding and solving many diffusion equations. Specifically, Fokker-Planck equations, which model the diffusion of probability measures, can be understood as gradient descent over entropy functionals in Wasserstein space. This equivalence, introduced by Jordan, Kinderlehrer and Otto, inspired the so-called JKO scheme to approximate these diffusion processes via an implicit discretization of the gradient flow in Wasserstein space. Solving the optimization problem associated with each JKO step, however, presents serious computational challenges. We introduce a scalable method to approximate Wasserstein gradient flows, targeted at machine learning applications. Our approach relies on input-convex neural networks (ICNNs) to discretize the JKO steps, which can be optimized by stochastic gradient descent. Unlike previous work, our method does not require domain discretization or particle simulation. As a result, we can sample from the measure at each time step of the diffusion and compute its probability density. We demonstrate the performance of our algorithm by computing diffusions following the Fokker-Planck equation and apply it to unnormalized density sampling as well as nonlinear filtering.
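For context on the scheme mentioned in the abstract: the JKO step is the implicit (proximal) Euler discretization of the gradient flow of a free-energy functional in Wasserstein space. In standard notation (ours, not quoted from the paper), the k-th step with step size τ > 0 reads

\[
\rho_{k+1} \in \operatorname*{arg\,min}_{\rho \in \mathcal{P}_2(\mathbb{R}^d)} \; F(\rho) \;+\; \frac{1}{2\tau}\, W_2^2(\rho, \rho_k),
\]

where W_2 is the 2-Wasserstein distance and, for Fokker-Planck diffusions, F combines a potential-energy term and negative entropy, e.g. F(ρ) = ∫ V(x) dρ(x) + ∫ ρ(x) log ρ(x) dx up to a temperature factor; as τ → 0 the sequence (ρ_k) approximates the continuous-time diffusion.

By Brenier's theorem, the optimal transport map between two such measures is the gradient of a convex potential, which makes a convex potential parameterized by an input-convex neural network a natural choice for each step. Below is a minimal sketch of such an ICNN, assuming a PyTorch-style setup; the class name, layer sizes, and softplus activation are illustrative choices, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ICNN(nn.Module):
    """Minimal input-convex neural network (in the spirit of Amos et al., 2017).

    The output is convex in x because the hidden-to-hidden weights are kept
    non-negative and the activation (softplus) is convex and non-decreasing.
    """

    def __init__(self, dim: int, hidden: int = 64, depth: int = 3):
        super().__init__()
        # Unconstrained weights applied directly to the input at every layer.
        self.input_layers = nn.ModuleList(
            [nn.Linear(dim, hidden)]
            + [nn.Linear(dim, hidden, bias=False) for _ in range(depth - 1)]
            + [nn.Linear(dim, 1, bias=False)]
        )
        # Hidden-to-hidden weights; clamped to be non-negative in forward().
        self.hidden_layers = nn.ModuleList(
            [nn.Linear(hidden, hidden) for _ in range(depth - 1)]
            + [nn.Linear(hidden, 1)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = F.softplus(self.input_layers[0](x))
        for lin_x, lin_z in zip(self.input_layers[1:], self.hidden_layers):
            # Non-negative weights on z preserve convexity of z in x.
            z = F.softplus(lin_x(x) + F.linear(z, lin_z.weight.clamp(min=0), lin_z.bias))
        return z  # shape (batch, 1): a scalar convex potential per sample
```

The transport map for one step can then be taken as the gradient of this convex potential with respect to its input (e.g. via torch.autograd.grad), and the potential's parameters trained by stochastic gradient descent on samples, which is what allows sampling and density evaluation without grids or particles.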
Author Information
Petr Mokrov (Skolkovo Institute of Science and Technology, Moscow Institute of Physics and Technology)
Alexander Korotin (Skolkovo Institute of Science and Technology)
Lingxiao Li (MIT)
Aude Genevay (MIT)
Justin Solomon (MIT)
Evgeny Burnaev (Skolkovo Institute of Science and Technology)
Evgeny is an experienced scientist working at the interface between machine learning and applied engineering problems. He obtained his Master's degree in Applied Physics and Mathematics from the Moscow Institute of Physics and Technology in 2006. After successfully defending his PhD thesis in Foundations of Computer Science at the Institute for Information Transmission Problems RAS (IITP RAS) in 2008, Evgeny stayed with the Institute as head of the IITP Data Analysis and Modeling group. Today, Evgeny's research interests encompass regression based on Gaussian processes, bootstrap, confidence sets and conformal predictors, volatility modeling and nonparametric estimation, statistical decisions, and rapid detection of anomalies in complex multicomponent systems.

Evgeny has consistently demonstrated deep fundamental knowledge and engineer-like thinking, which enable him to effectively use methods of statistics, machine learning, and predictive modeling in practical tasks in hi-tech industries, primarily aerospace, medicine, and the life sciences. He has carried out a number of successful industrial projects with Airbus, Eurocopter, and the Sahara Force India Formula 1 team, among others. The corresponding data analysis algorithms, developed by Evgeny and his group at IITP, formed the core of an algorithmic software library for surrogate modeling and optimization. With this functionality, engineers can construct fast mathematical approximations to long-running computer codes (implementing physical models) from available data and perform design-space exploration for trade-off studies. The software library passed the final Technology Readiness Level 6 certification at Airbus. According to Airbus experts, application of the library “provides the reduction of up to 10% of lead time and cost in several areas of the aircraft design process”; several dozen Airbus departments now use it. A spin-off company later developed a software platform for design-space exploration with a GUI based on this algorithmic core.

Evgeny also has considerable teaching experience, in both Russian and English. He has developed and taught various undergraduate and graduate courses in applied mathematics at MIPT, IITP, the Yandex School of Data Analysis, and the Humboldt University of Berlin, as well as mini-courses on applying machine learning to multidisciplinary engineering modeling and optimization for technology companies such as Astrium, Safran, SAFT, and CNES. Before joining Skoltech, Evgeny was a lecturer at the Yandex School of Data Analysis, Associate Professor and Vice Chairman of the Information Transmission Problems and Data Analysis Chair at MIPT, a data analysis expert at DATADVANCE LLC, and head of the IITP Data Analysis and Predictive Modeling Lab. At Skoltech, Evgeny is actively engaged in the development of CDISE educational and research programs, and continues his research on theoretical tools for estimating the performance of change-point algorithms, effective algorithms for anomaly detection and failure prediction, the analysis of their properties, and the development of a core library for anomaly detection and failure prediction.
More from the Same Authors
- 2022 Poster: Wasserstein Iterative Networks for Barycenter Estimation
  Alexander Korotin · Vage Egiazarian · Lingxiao Li · Evgeny Burnaev
- 2022 Poster: Kantorovich Strikes Back! Wasserstein GANs are not Optimal Transport?
  Alexander Korotin · Alexander Kolesov · Evgeny Burnaev
- 2022 Spotlight: Lightning Talks 6B-2
  Alexander Korotin · Jinyuan Jia · Weijian Deng · Shi Feng · Maying Shen · Denizalp Goktas · Fang-Yi Yu · Alexander Kolesov · Sadie Zhao · Stephen Gould · Hongxu Yin · Wenjie Qu · Liang Zheng · Evgeny Burnaev · Amy Greenwald · Neil Gong · Pavlo Molchanov · Yiling Chen · Lei Mao · Jianna Liu · Jose M. Alvarez
- 2022 Spotlight: Kantorovich Strikes Back! Wasserstein GANs are not Optimal Transport?
  Alexander Korotin · Alexander Kolesov · Evgeny Burnaev
- 2022 Spotlight: Lightning Talks 2A-4
  Sarthak Mittal · Richard Grumitt · Zuoyu Yan · Lihao Wang · Dongsheng Wang · Alexander Korotin · Jiangxin Sun · Ankit Gupta · Vage Egiazarian · Tengfei Ma · Yi Zhou · Yishi Xu · Albert Gu · Biwei Dai · Chunyu Wang · Yoshua Bengio · Uros Seljak · Miaoge Li · Guillaume Lajoie · Yiqun Wang · Liangcai Gao · Lingxiao Li · Jonathan Berant · Huang Hu · Xiaoqing Zheng · Zhibin Duan · Hanjiang Lai · Evgeny Burnaev · Zhi Tang · Zhi Jin · Xuanjing Huang · Chaojie Wang · Yusu Wang · Jian-Fang Hu · Bo Chen · Chao Chen · Hao Zhou · Mingyuan Zhou
- 2022 Spotlight: Wasserstein Iterative Networks for Barycenter Estimation
  Alexander Korotin · Vage Egiazarian · Lingxiao Li · Evgeny Burnaev
- 2021 Poster: Manifold Topology Divergence: a Framework for Comparing Data Manifolds
  Serguei Barannikov · Ilya Trofimov · Grigorii Sotnikov · Ekaterina Trimbach · Alexander Korotin · Alexander Filippov · Evgeny Burnaev
- 2021 Poster: BooVAE: Boosting Approach for Continual Learning of VAE
  Evgenii Egorov · Anna Kuzina · Evgeny Burnaev
- 2021 Poster: Object DGCNN: 3D Object Detection using Dynamic Graphs
  Yue Wang · Justin Solomon
- 2021 Poster: MarioNette: Self-Supervised Sprite Learning
  Dmitriy Smirnov · Michael Gharbi · Matthew Fisher · Vitor Guizilini · Alexei Efros · Justin Solomon
- 2021 Poster: Do Neural Optimal Transport Solvers Work? A Continuous Wasserstein-2 Benchmark
  Alexander Korotin · Lingxiao Li · Aude Genevay · Justin Solomon · Alexander Filippov · Evgeny Burnaev
- 2020 Poster: Continuous Regularized Wasserstein Barycenters
  Lingxiao Li · Aude Genevay · Mikhail Yurochkin · Justin Solomon
- 2019: Poster Session
  Lili Yu · Aleksei Kroshnin · Alex Delalande · Andrew Carr · Anthony Tompkins · Aram-Alexandre Pooladian · Arnaud Robert · Ashok Vardhan Makkuva · Aude Genevay · Bangjie Liu · Bo Zeng · Charlie Frogner · Elsa Cazelles · Esteban G Tabak · Fabio Ramos · François-Pierre Paty · Georgios Balikas · Giulio Trigila · Hao Wang · Hinrich Mahler · Jared Nielsen · Karim Lounici · Kyle Swanson · Mukul Bhutani · Pierre Bréchet · Piotr Indyk · Samuel Cohen · Stefanie Jegelka · Tao Wu · Thibault Sejourne · Tudor Manole · Wenjun Zhao · Wenlin Wang · Wenqi Wang · Yonatan Dukler · Zihao Wang · Chaosheng Dong
- 2019: Aude Genevay
  Aude Genevay
- 2019 Poster: PRNet: Self-Supervised Learning for Partial-to-Partial Registration
  Yue Wang · Justin Solomon
- 2019 Poster: Alleviating Label Switching with Optimal Transport
  Pierre Monteiller · Sebastian Claici · Edward Chien · Farzaneh Mirzazadeh · Justin Solomon · Mikhail Yurochkin
- 2019 Poster: Hierarchical Optimal Transport for Document Representation
  Mikhail Yurochkin · Sebastian Claici · Edward Chien · Farzaneh Mirzazadeh · Justin Solomon
- 2017 Poster: Parallel Streaming Wasserstein Barycenters
  Matt Staib · Sebastian Claici · Justin Solomon · Stefanie Jegelka
- 2017 Tutorial: A Primer on Optimal Transport
  Marco Cuturi · Justin Solomon