Poster
Neural Ordinary Differential Equations
Tian Qi Chen · Yulia Rubanova · Jesse Bettencourt · David Duvenaud

Tue Dec 04 02:00 PM -- 04:00 PM (PST) @ Room 210 #3

We introduce a new family of deep neural network models. Instead of specifying a discrete sequence of hidden layers, we parameterize the derivative of the hidden state using a neural network. The output of the network is computed using a black-box differential equation solver. These continuous-depth models have constant memory cost, adapt their evaluation strategy to each input, and can explicitly trade numerical precision for speed. We demonstrate these properties in continuous-depth residual networks and continuous-time latent variable models. We also construct continuous normalizing flows, a generative model that can be trained by maximum likelihood without partitioning or ordering the data dimensions. For training, we show how to scalably backpropagate through any ODE solver, without access to its internal operations. This allows end-to-end training of ODEs within larger models.
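The core construction in the abstract (a hidden state whose derivative is given by a neural network, integrated from t=0 to t=1 by an ODE solver) can be sketched in a few lines. The sketch below is illustrative, not the authors' released implementation: the two-layer tanh MLP, the parameter dictionary, and the fixed-step RK4 integrator standing in for the paper's adaptive black-box solver are all assumptions made for brevity.

```python
import jax
import jax.numpy as jnp

# Hypothetical dynamics network: a small MLP f(h, t; theta) defining dh/dt.
# Width, depth, and the tanh nonlinearity are illustrative choices,
# not the paper's exact architecture.
def init_params(key, dim=4, width=16):
    k1, k2 = jax.random.split(key)
    return {
        "W1": 0.1 * jax.random.normal(k1, (width, dim + 1)),  # +1 input for t
        "b1": jnp.zeros(width),
        "W2": 0.1 * jax.random.normal(k2, (dim, width)),
        "b2": jnp.zeros(dim),
    }

def f(params, h, t):
    z = jnp.concatenate([h, jnp.array([t])])
    return params["W2"] @ jnp.tanh(params["W1"] @ z + params["b1"]) + params["b2"]

def odeint_rk4(params, h0, t0=0.0, t1=1.0, steps=20):
    """Fixed-step RK4, standing in for the paper's adaptive black-box solver."""
    dt = (t1 - t0) / steps
    h, t = h0, t0
    for _ in range(steps):
        k1 = f(params, h, t)
        k2 = f(params, h + 0.5 * dt * k1, t + 0.5 * dt)
        k3 = f(params, h + 0.5 * dt * k2, t + 0.5 * dt)
        k4 = f(params, h + dt * k3, t + dt)
        h = h + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
    return h

params = init_params(jax.random.PRNGKey(0))
h0 = jax.random.normal(jax.random.PRNGKey(1), (4,))  # "input layer" state h(0)
h1 = odeint_rk4(params, h0)                          # "output layer" state h(1)
```

The abstract's claim about backpropagating through any ODE solver without access to its internal operations refers to the adjoint sensitivity method: gradients are obtained by solving a second, augmented ODE backward in time, so nothing from the forward solve needs to be stored. The continuation below uses jax.vjp for the required vector-Jacobian products; the plain Euler backward pass and the toy loss are simplifications for the sketch, whereas the paper integrates the same augmented system with the black-box solver.

```python
def adjoint_grads(params, h1, dLdh1, t0=0.0, t1=1.0, steps=200):
    """Euler integration of the augmented adjoint system backward in time.

    Given dL/dh(t1), returns (dL/dh(t0), dL/dtheta) without differentiating
    through the forward solver's internals.
    """
    dt = (t1 - t0) / steps
    h, a = h1, dLdh1
    dLdp = jax.tree_util.tree_map(jnp.zeros_like, params)
    t = t1
    for _ in range(steps):
        # Vector-Jacobian products a^T df/dh and a^T df/dtheta via jax.vjp.
        fh, vjp_fn = jax.vjp(lambda p, h_: f(p, h_, t), params, h)
        a_dfdp, a_dfdh = vjp_fn(a)
        h = h - dt * fh          # run the state backward: dh/dt = f
        a = a + dt * a_dfdh      # adjoint dynamics: da/dt = -a^T df/dh
        dLdp = jax.tree_util.tree_map(lambda g, d: g + dt * d, dLdp, a_dfdp)
        t -= dt
    return a, dLdp

# Toy loss L = ||h(t1)||^2, used only to produce an example gradient dL/dh(t1).
dLdh1 = jax.grad(lambda h: jnp.sum(h ** 2))(h1)
dLdh0, dLdparams = adjoint_grads(params, h1, dLdh1)
```

Because the backward pass is itself just an ODE solve over the state, the adjoint, and the parameter gradients, its memory cost is constant in depth, which is what makes end-to-end training of ODEs inside larger models practical.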

Author Information

Tian Qi Chen (University of Toronto)
Yulia Rubanova (University of Toronto)
Jesse Bettencourt (University of Toronto)
David Duvenaud (University of Toronto)

David Duvenaud is an assistant professor of computer science at the University of Toronto. His research focuses on continuous-time models, latent-variable models, and deep learning. He did his postdoc at Harvard University and his Ph.D. at the University of Cambridge. David also co-founded Invenia, an energy forecasting and trading company.
