Mechanistic models based on differential equations are a key component of scientific applications of machine learning. Inference in such models is usually computationally demanding because it involves repeatedly solving the differential equation. The main difficulty is that the numerical solver is hard to combine with standard inference techniques. Recent work in probabilistic numerics has developed a new class of solvers for ordinary differential equations (ODEs) that phrase the solution process directly in terms of Bayesian filtering. Here we show that this formulation makes it straightforward, both conceptually and numerically, to combine such solvers with latent force models in the ODE itself. Approximate Bayesian inference on the latent force as well as the ODE solution can then be performed in a single, linear-complexity pass of an extended Kalman filter/smoother, that is, at the cost of computing a single ODE solution. We demonstrate the expressiveness and performance of the algorithm by training, among others, a non-parametric SIRD model on data from the COVID-19 outbreak.
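To make the construction concrete, the following is a minimal sketch (not the authors' implementation) of the idea on a toy problem: a logistic ODE with an unknown additive force, y'(t) = y(1 - y) + f(t). The ODE solution and the latent force are stacked into one Gauss-Markov state, the ODE is enforced by extended-Kalman-filter updates on a zero-valued residual, and noisy observations of y enter through ordinary Kalman updates. All model choices (step size, priors, synthetic data values) are illustrative assumptions; a backward Rauch-Tung-Striebel smoothing pass would complete the filter/smoother but is omitted for brevity.

```python
import numpy as np

# State x = [y, y', f]: the solution y follows a once-integrated Wiener
# process prior, the latent force f an Ornstein-Uhlenbeck prior.
h = 0.05                                     # solver step size
sigma_y, theta, sigma_f = 1.0, 1.0, 0.5      # illustrative prior parameters

# Block-diagonal transition model for one step of length h.
A = np.zeros((3, 3))
A[:2, :2] = [[1.0, h], [0.0, 1.0]]           # integrated Wiener process
A[2, 2] = np.exp(-theta * h)                 # OU decay of the force
Q = np.zeros((3, 3))
Q[:2, :2] = sigma_y**2 * np.array([[h**3 / 3, h**2 / 2],
                                   [h**2 / 2, h]])
Q[2, 2] = sigma_f**2 / (2 * theta) * (1 - np.exp(-2 * theta * h))

def ode_residual(x):
    """Zero-valued pseudo-observation: y' minus the vector field."""
    y, dy, f = x
    return dy - (y * (1 - y) + f)

def ode_jacobian(x):
    """Jacobian of the residual with respect to [y, y', f]."""
    y = x[0]
    return np.array([[-(1 - 2 * y), 1.0, -1.0]])

# Synthetic, noisy data on y at a few grid indices (for illustration only).
data_obs = {40: 0.35, 80: 0.6, 120: 0.8}
H_data = np.array([[1.0, 0.0, 0.0]])
R_data = np.array([[1e-3]])

m = np.array([0.1, 0.1 * (1 - 0.1), 0.0])    # initial mean [y, y', f]
P = np.diag([1e-6, 1e-2, 1.0])               # initial covariance

for k in range(1, 161):
    # Predict with the joint prior over (solution, force).
    m, P = A @ m, A @ P @ A.T + Q

    # EKF update on the ODE residual (noise-free observation equal to zero).
    H = ode_jacobian(m)
    S = H @ P @ H.T
    K = P @ H.T @ np.linalg.inv(S)
    m = m + K.ravel() * (0.0 - ode_residual(m))
    P = P - K @ S @ K.T

    # Standard Kalman update whenever a data point is available.
    if k in data_obs:
        S = H_data @ P @ H_data.T + R_data
        K = P @ H_data.T @ np.linalg.inv(S)
        m = m + K.ravel() * (data_obs[k] - m[0])
        P = P - K @ S @ K.T

print("posterior mean [y, y', f] at final time:", m)
print("posterior std of latent force:", np.sqrt(P[2, 2]))
```

Both the ODE constraint and the data enter as measurement updates on the same state, which is why the joint posterior over solution and latent force comes out of a single forward pass at the cost of one ODE solve.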
Author Information
Jonathan Schmidt (University of Tübingen)
Nicholas Krämer (University of Tübingen)
Philipp Hennig (University of Tübingen and MPI Tübingen)
More from the Same Authors
- 2021 Spotlight: An Infinite-Feature Extension for Bayesian ReLU Nets That Fixes Their Asymptotic Overconfidence
  Agustinus Kristiadi · Matthias Hein · Philipp Hennig
- 2021: Mixtures of Laplace Approximations for Improved Post-Hoc Uncertainty in Deep Learning
  Runa Eschenhagen · Erik Daxberger · Philipp Hennig · Agustinus Kristiadi
- 2021: Being a Bit Frequentist Improves Bayesian Neural Networks
  Agustinus Kristiadi · Matthias Hein · Philipp Hennig
- 2021 Poster: Laplace Redux - Effortless Bayesian Deep Learning
  Erik Daxberger · Agustinus Kristiadi · Alexander Immer · Runa Eschenhagen · Matthias Bauer · Philipp Hennig
- 2021 Poster: An Infinite-Feature Extension for Bayesian ReLU Nets That Fixes Their Asymptotic Overconfidence
  Agustinus Kristiadi · Matthias Hein · Philipp Hennig
- 2021 Poster: Linear-Time Probabilistic Solution of Boundary Value Problems
  Nicholas Krämer · Philipp Hennig
- 2021 Poster: Cockpit: A Practical Debugging Tool for the Training of Deep Neural Networks
  Frank Schneider · Felix Dangel · Philipp Hennig
- 2016 Workshop: Optimizing the Optimizers
  Maren Mahsereci · Alex Davies · Philipp Hennig
- 2015 Workshop: Probabilistic Integration
  Michael A Osborne · Philipp Hennig
- 2015 Poster: Probabilistic Line Searches for Stochastic Optimization
  Maren Mahsereci · Philipp Hennig
- 2015 Oral: Probabilistic Line Searches for Stochastic Optimization
  Maren Mahsereci · Philipp Hennig
- 2014 Poster: Incremental Local Gaussian Regression
  Franziska Meier · Philipp Hennig · Stefan Schaal
- 2014 Poster: Probabilistic ODE Solvers with Runge-Kutta Means
  Michael Schober · David Duvenaud · Philipp Hennig
- 2014 Poster: Sampling for Inference in Probabilistic Models with Fast Bayesian Quadrature
  Tom Gunter · Michael A Osborne · Roman Garnett · Philipp Hennig · Stephen J Roberts
- 2014 Oral: Probabilistic ODE Solvers with Runge-Kutta Means
  Michael Schober · David Duvenaud · Philipp Hennig
- 2013 Workshop: Bayesian Optimization in Theory and Practice
  Matthew Hoffman · Jasper Snoek · Nando de Freitas · Michael A Osborne · Ryan Adams · Sebastien Bubeck · Philipp Hennig · Remi Munos · Andreas Krause
- 2013 Poster: The Randomized Dependence Coefficient
  David Lopez-Paz · Philipp Hennig · Bernhard Schölkopf
- 2012 Workshop: Probabilistic Numerics
  Philipp Hennig · John P Cunningham · Michael A Osborne
- 2011 Poster: Optimal Reinforcement Learning for Gaussian Systems
  Philipp Hennig