

Poster

Neural Networks with Cheap Differential Operators

Tian Qi Chen · David Duvenaud

East Exhibition Hall B + C #56

Keywords: Algorithms -> Density Estimation; Algorithms -> Uncertainty Estimation; Algorithms -> Unsupervised Learning; Deep Learning


Abstract:

Gradients of neural networks can be computed efficiently for any architecture, but some applications require computing differential operators with higher time complexity. We describe a family of neural network architectures that allow easy access to a family of differential operators involving dimension-wise derivatives, and we show how to modify the backward computation graph to compute them efficiently. We demonstrate the use of these operators for solving root-finding subproblems in implicit ODE solvers, exact density evaluation for continuous normalizing flows, and evaluating the Fokker-Planck equation for training stochastic differential equation models.
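The dimension-wise derivatives the abstract refers to are the quantities df_i/dx_i, i.e. the diagonal of the Jacobian of a network f: R^D -> R^D. As a minimal sketch of why this is costly for a generic architecture (the baseline the paper improves on), the JAX snippet below materializes the full Jacobian and reads off its diagonal; the toy network f and its weights are illustrative assumptions, not the paper's model.

```python
import jax
import jax.numpy as jnp

def f(x):
    # Toy vector-valued network f: R^3 -> R^3. The dimension-wise
    # derivatives of interest are df_i/dx_i, the Jacobian diagonal.
    W = jnp.array([[0.1, 0.2, 0.3],
                   [0.4, 0.5, 0.6],
                   [0.7, 0.8, 0.9]])
    return jnp.tanh(W @ x)

x = jnp.ones(3)

# Generic baseline: build the full D x D Jacobian, then take its
# diagonal. This costs D passes through the network, which is the
# higher time complexity the paper's architectures are designed
# to avoid.
diag_naive = jnp.diag(jax.jacfwd(f)(x))
print(diag_naive)  # [df_0/dx_0, df_1/dx_1, df_2/dx_2]
```

The abstract's claim is that, for networks in the proposed family, a modified backward computation graph yields this diagonal far more cheaply than the D-pass baseline shown above, which is what makes the listed applications (implicit ODE solvers, continuous normalizing flows, Fokker-Planck training) practical.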
