
Neural Networks with Cheap Differential Operators
Tian Qi Chen · David Duvenaud

Tue Dec 10 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #56

Gradients of neural networks can be computed efficiently for any architecture, but some applications require computing differential operators with higher time complexity. We describe a family of neural network architectures that allow easy access to a family of differential operators involving \emph{dimension-wise derivatives}, and we show how to modify the backward computation graph to compute them efficiently. We demonstrate the use of these operators for solving root-finding subproblems in implicit ODE solvers, exact density evaluation for continuous normalizing flows, and evaluating the Fokker-Planck equation for training stochastic differential equation models.
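To make the cost gap concrete, the sketch below (illustrative only; the small MLP `f`, its weights, and dimension `d` are arbitrary choices, not the paper's architecture) computes the dimension-wise derivatives diag(df/dx) the naive way: materializing the full Jacobian, which requires on the order of d backward passes. The paper's contribution is an architecture and modified backward graph that recovers this diagonal without that O(d) overhead.

```python
import jax
import jax.numpy as jnp

# Illustrative only: a small MLP f: R^d -> R^d with arbitrary weights.
d = 4
W1 = jax.random.normal(jax.random.PRNGKey(0), (d, 8))
W2 = jax.random.normal(jax.random.PRNGKey(1), (8, d))

def f(x):
    return jnp.tanh(x @ W1) @ W2

x = jnp.ones(d)

# Naive route: build the full d x d Jacobian (d reverse-mode passes
# under the hood) and read off its diagonal -- the cost the paper's
# architectures avoid.
diag_naive = jnp.diag(jax.jacrev(f)(x))

# Cross-check via d forward-mode passes, one basis vector each:
# the i-th JVP against e_i yields column i, whose i-th entry is J[i, i].
diag_fwd = jnp.stack(
    [jax.jvp(f, (x,), (jnp.eye(d)[i],))[1][i] for i in range(d)]
)
```

Both routes scale linearly in d for a general network; the paper's dimension-wise operators target exactly this bottleneck.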

Author Information

Tian Qi Chen (University of Toronto)
David Duvenaud (University of Toronto)

David Duvenaud is an assistant professor in computer science at the University of Toronto. His research focuses on continuous-time models, latent-variable models, and deep learning. He did his postdoc at Harvard University and his Ph.D. at the University of Cambridge. David also co-founded Invenia, an energy forecasting and trading company.
