Poster
Learning Differential Equations that are Easy to Solve
Jacob Kelly · Jesse Bettencourt · Matthew Johnson · David Duvenaud

Tue Dec 08 09:00 AM -- 11:00 AM (PST) @ Poster Session 1 #176

Differential equations parameterized by neural networks become expensive to solve numerically as training progresses. We propose a remedy that encourages learned dynamics to be easier to solve. Specifically, we introduce a differentiable surrogate for the time cost of standard numerical solvers, using higher-order derivatives of solution trajectories. These derivatives are efficient to compute with Taylor-mode automatic differentiation. Optimizing this additional objective trades model performance against the time cost of solving the learned dynamics. We demonstrate our approach by training models that are substantially faster to solve, yet nearly as accurate, on supervised classification, density estimation, and time-series modelling tasks.
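To make the mechanism concrete, here is a minimal JAX sketch of the idea, not the authors' released implementation: it assumes autonomous dynamics f(z) (the paper also handles time-dependent dynamics), and the helper names taylor_coefficients and solver_cost_surrogate are invented for this example. It uses Taylor-mode automatic differentiation (jax.experimental.jet) to build higher-order Taylor coefficients of the solution trajectory recursively, then penalizes the magnitude of the highest-order one.

import jax.numpy as jnp
from jax.experimental.jet import jet

def taylor_coefficients(f, z0, order):
    # Taylor coefficients y_k = z^(k)(0) / k! of the solution of
    # dz/dt = f(z) with z(0) = z0, for k = 1, ..., order.
    # Since dz/dt = f(z(t)), the coefficients satisfy the recursion
    # y_{k+1} = f_k / (k + 1), where f_k is the k-th Taylor
    # coefficient of t -> f(z(t)), computed with jet.
    coeffs = [f(z0)]  # y_1 = f(z0)
    while len(coeffs) < order:
        _, f_terms = jet(f, (z0,), (coeffs,))
        coeffs.append(f_terms[-1] / (len(coeffs) + 1))
    return coeffs

def solver_cost_surrogate(f, z0, order=3):
    # Differentiable penalty: mean squared highest-order coefficient
    # (proportional to the order-th time derivative of the trajectory).
    # Adding lam * solver_cost_surrogate(f, z0) to a neural ODE training
    # loss trades accuracy against the cost of solving the dynamics.
    y_top = taylor_coefficients(f, z0, order)[-1]
    return jnp.mean(y_top ** 2)

# Toy usage; in the paper, f would be a neural network.
f = lambda z: -z + jnp.sin(z)
print(solver_cost_surrogate(f, jnp.ones(4), order=3))

The design intuition: adaptive solvers shrink their step size when local error is large, and local error grows with the higher-order derivatives of the trajectory, so penalizing those derivatives directly targets solver work while remaining differentiable with respect to the network parameters.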

Author Information

Jacob Kelly (University of Toronto)

I'm an undergrad in CS, Math, and Stats at the University of Toronto. I'm currently an ML Research Intern at Deep Genomics. I'm also very fortunate to work with David Duvenaud at the Vector Institute. I'm interested in latent-variable models, neural ODEs, variational inference, and genomics. My long-term research goal is to combine machine learning with novel sources of data to develop new tools for improved diagnosis and treatment of patients.

Jesse Bettencourt (University of Toronto)
Matthew Johnson (Google Brain)

Matt Johnson is a research scientist at Google Brain interested in software systems powering machine learning research. He is the tech lead for JAX, a system for composable function transformations in Python. He was a postdoc at Harvard University with Ryan Adams, working on composing graphical models with neural networks and applications in neurobiology. His Ph.D. is from MIT, where he worked with Alan Willsky on Bayesian nonparametrics, time series models, and scalable inference.

David Duvenaud (University of Toronto)

David Duvenaud is an assistant professor in computer science at the University of Toronto. His research focuses on continuous-time models, latent-variable models, and deep learning. He did his postdoctoral work at Harvard University and his Ph.D. at the University of Cambridge. David also co-founded Invenia, an energy forecasting and trading company.
