Neural Conservation Laws: A Divergence-Free Perspective
Jack Richter-Powell · Yaron Lipman · Ricky T. Q. Chen

Tue Nov 29 09:00 AM -- 11:00 AM (PST) @ Hall J #305

We investigate the parameterization of deep neural networks that by design satisfy the continuity equation, a fundamental conservation law. This is enabled by the observation that any solution of the continuity equation can be represented as a divergence-free vector field. We hence propose building divergence-free neural networks through the concept of differential forms, and with the aid of automatic differentiation, realize two practical constructions. As a result, we can parameterize pairs of densities and vector fields that always satisfy the continuity equation by construction, forgoing the need for extra penalty methods or expensive numerical simulation. Furthermore, we prove these models are universal and can thus represent any divergence-free vector field. Finally, we experimentally validate our approaches by computing neural network-based solutions to fluid equations, solving for the Hodge decomposition, and learning dynamical optimal transport maps.
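The key observation above can be made concrete: stacking the density and flux into a spacetime field u = (ρ, ρv) turns the continuity equation ∂ρ/∂t + ∇·(ρv) = 0 into the statement that u is divergence-free in the joint (t, x) coordinates, so parameterizing divergence-free fields suffices to parameterize solutions. Below is a minimal sketch of one well-known way to build an exactly divergence-free field with automatic differentiation: take the row-wise divergence of an antisymmetric matrix field A(x), since v_i = Σ_j ∂A_ij/∂x_j has zero divergence whenever A_ij = -A_ji and mixed partials commute. This is an illustration in the spirit of the paper, not the authors' released code; the architecture and function names (mlp, antisymmetric_field, divfree_field) are ours.

```python
import jax
import jax.numpy as jnp

d = 3  # spatial dimension (illustrative choice)

def mlp(params, x):
    # Tiny MLP mapping R^d -> R^(d*d); params is a list of (W, b) pairs.
    h = x
    for W, b in params[:-1]:
        h = jnp.tanh(W @ h + b)
    W, b = params[-1]
    return W @ h + b

def antisymmetric_field(params, x):
    # Reshape the MLP output to a d x d matrix and antisymmetrize it,
    # so A(x) = M(x) - M(x)^T satisfies A_ij = -A_ji everywhere.
    M = mlp(params, x).reshape(d, d)
    return M - M.T

def divfree_field(params, x):
    # v_i(x) = sum_j dA_ij/dx_j  (row-wise divergence of A).
    # jacfwd returns dA_ij/dx_k with shape (d, d, d); trace over (j, k).
    J = jax.jacfwd(lambda y: antisymmetric_field(params, y))(x)
    return jnp.trace(J, axis1=1, axis2=2)

# Sanity check: div(v) should vanish up to floating-point error,
# with no penalty term or simulation involved.
key = jax.random.PRNGKey(0)
layer_shapes = [(16, d), (16, 16), (d * d, 16)]
params = []
for out_dim, in_dim in layer_shapes:
    key, k_w, k_b = jax.random.split(key, 3)
    params.append((jax.random.normal(k_w, (out_dim, in_dim)) * 0.1,
                   jax.random.normal(k_b, (out_dim,)) * 0.1))

x = jax.random.normal(key, (d,))
div_v = jnp.trace(jax.jacfwd(lambda y: divfree_field(params, y))(x))
print(div_v)  # ~0 up to numerical precision
```

Because the divergence vanishes identically by the antisymmetry of A, the constraint is enforced by construction rather than learned, which is the practical appeal of this family of parameterizations.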

Author Information

Jack Richter-Powell (Vector Institute)
Yaron Lipman (Meta AI, Weizmann Institute of Science)
Ricky T. Q. Chen (FAIR Labs, Meta AI)
