We investigate the parameterization of deep neural networks that by design satisfy the continuity equation, a fundamental conservation law. This is enabled by the observation that any solution of the continuity equation can be represented as a divergence-free vector field. We hence propose building divergence-free neural networks through the concept of differential forms, and with the aid of automatic differentiation, realize two practical constructions. As a result, we can parameterize pairs of densities and vector fields that always satisfy the continuity equation by construction, forgoing the need for extra penalty methods or expensive numerical simulation. Furthermore, we prove these models are universal and so can be used to represent any divergence-free vector field. Finally, we experimentally validate our approaches by computing neural network-based solutions to fluid equations, solving for the Hodge decomposition, and learning dynamical optimal transport maps.
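The key observation can be made concrete: a density-velocity pair (ρ, u) solves the continuity equation ∂ρ/∂t + ∇·(ρu) = 0 exactly when the combined space-time field (ρ, ρu) is divergence-free, so it suffices to parameterize divergence-free fields directly. The sketch below is not the authors' code and all names in it are hypothetical; it illustrates the principle in the classical 3D special case, where the curl of any smooth neural potential is divergence-free by the identity div(curl u) = 0, with both differential operators obtained from automatic differentiation. The paper's differential-forms constructions generalize this idea to arbitrary dimension.

```python
# Minimal sketch (assumed, not the paper's implementation): in 3D, the curl
# of any smooth vector potential is divergence-free, since div(curl u) = 0.
import jax
import jax.numpy as jnp

def mlp(params, x):
    # Tiny MLP potential u: R^3 -> R^3 (hypothetical architecture).
    for W, b in params[:-1]:
        x = jnp.tanh(W @ x + b)
    W, b = params[-1]
    return W @ x + b

def curl(u, x):
    # curl(u)(x) assembled from the Jacobian J[i, j] = d u_i / d x_j.
    J = jax.jacfwd(u)(x)
    return jnp.array([J[2, 1] - J[1, 2],
                      J[0, 2] - J[2, 0],
                      J[1, 0] - J[0, 1]])

def divergence(v, x):
    # div(v)(x) = trace of the Jacobian of v.
    return jnp.trace(jax.jacfwd(v)(x))

# Random initialization of the potential network.
key = jax.random.PRNGKey(0)
sizes = [3, 32, 32, 3]
keys = jax.random.split(key, len(sizes) - 1)
params = [(jax.random.normal(k, (m, n)) / jnp.sqrt(n), jnp.zeros(m))
          for k, n, m in zip(keys, sizes[:-1], sizes[1:])]

u = lambda x: mlp(params, x)
v = lambda x: curl(u, x)       # divergence-free by construction
x0 = jnp.array([0.3, -1.2, 0.7])
print(divergence(v, x0))       # ~0 up to floating-point error
```

Because the conservation law holds for every parameter setting, training such a model only needs to fit the potential to data; no penalty term or numerical solver is required to enforce the constraint.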
Author Information
Jack Richter-Powell (Vector Institute)
Yaron Lipman (Meta AI, Weizmann Institute of Science)
Ricky T. Q. Chen (FAIR Labs, Meta AI)
More from the Same Authors
- 2021: Input Convex Gradient Networks
  Jack Richter-Powell · Jonathan Lorraine · Brandon Amos
- 2023 Poster: Task-driven metric learning for end-to-end model learning
  Dishank Bansal · Ricky T. Q. Chen · Mustafa Mukadam · Brandon Amos
- 2022 Poster: Semi-Discrete Normalizing Flows through Differentiable Tessellation
  Ricky T. Q. Chen · Brandon Amos · Maximilian Nickel
- 2022 Poster: VisCo Grids: Surface Reconstruction with Viscosity and Coarea Grids
  Albert Pumarola · Artsiom Sanakoyeu · Lior Yariv · Ali Thabet · Yaron Lipman
- 2022 Poster: Theseus: A Library for Differentiable Nonlinear Optimization
  Luis Pineda · Taosha Fan · Maurizio Monge · Shobha Venkataraman · Paloma Sodhi · Ricky T. Q. Chen · Joseph Ortiz · Daniel DeTone · Austin Wang · Stuart Anderson · Jing Dong · Brandon Amos · Mustafa Mukadam
- 2021 Oral: Moser Flow: Divergence-based Generative Modeling on Manifolds
  Noam Rozen · Aditya Grover · Maximilian Nickel · Yaron Lipman
- 2021 Oral: Volume Rendering of Neural Implicit Surfaces
  Lior Yariv · Jiatao Gu · Yoni Kasten · Yaron Lipman
- 2021 Poster: Moser Flow: Divergence-based Generative Modeling on Manifolds
  Noam Rozen · Aditya Grover · Maximilian Nickel · Yaron Lipman
- 2021 Poster: Volume Rendering of Neural Implicit Surfaces
  Lior Yariv · Jiatao Gu · Yoni Kasten · Yaron Lipman
- 2020 Poster: Set2Graph: Learning Graphs From Sets
  Hadar Serviansky · Nimrod Segol · Jonathan Shlomi · Kyle Cranmer · Eilam Gross · Haggai Maron · Yaron Lipman
- 2020 Poster: Multiview Neural Surface Reconstruction by Disentangling Geometry and Appearance
  Lior Yariv · Yoni Kasten · Dror Moran · Meirav Galun · Matan Atzmon · Basri Ronen · Yaron Lipman
- 2020 Spotlight: Multiview Neural Surface Reconstruction by Disentangling Geometry and Appearance
  Lior Yariv · Yoni Kasten · Dror Moran · Meirav Galun · Matan Atzmon · Basri Ronen · Yaron Lipman
- 2019 Poster: Controlling Neural Level Sets
  Matan Atzmon · Niv Haim · Lior Yariv · Ofer Israelov · Haggai Maron · Yaron Lipman
- 2019 Poster: Provably Powerful Graph Networks
  Haggai Maron · Heli Ben-Hamu · Hadar Serviansky · Yaron Lipman
- 2018 Poster: (Probably) Concave Graph Matching
  Haggai Maron · Yaron Lipman
- 2018 Spotlight: (Probably) Concave Graph Matching
  Haggai Maron · Yaron Lipman