Deep Gaussian Processes learn probabilistic data representations for supervised learning by cascading multiple Gaussian Processes. While this model family promises flexible predictive distributions, exact inference is not tractable. Approximate inference techniques trade off the ability to closely resemble the posterior distribution against speed of convergence and computational efficiency. We propose a novel Gaussian variational family that allows for retaining covariances between latent processes while achieving fast convergence by marginalising out all global latent variables. After providing a proof of how this marginalisation can be done for general covariances, we restrict them to the ones we empirically found to be most important in order to also achieve computational efficiency. We provide an efficient implementation of our new approach and apply it to several benchmark datasets. It yields excellent results and strikes a better balance between accuracy and calibrated uncertainty estimates than its state-of-the-art alternatives.
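To make the cascading structure concrete, here is a minimal sketch (not the authors' method or implementation): it draws a sample from a toy two-layer deep GP prior by feeding the output of one GP in as the input of the next. The RBF kernel, all hyperparameter values, and the helper names rbf_kernel and sample_gp are illustrative assumptions; the sketch shows only the prior composition, not the proposed structured variational family.

```python
# Illustrative sketch of a two-layer deep GP prior (assumed toy setup,
# not the paper's model): the output of the first GP becomes the input
# of the second, i.e. the processes are cascaded.
import numpy as np

def rbf_kernel(x, y, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel k(x, y) on 1-D inputs."""
    sqdist = (x[:, None] - y[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

def sample_gp(x, lengthscale=1.0, variance=1.0, jitter=1e-6, rng=None):
    """Draw one function sample f ~ GP(0, k) evaluated at inputs x."""
    if rng is None:
        rng = np.random.default_rng(0)
    K = rbf_kernel(x, x, lengthscale, variance)
    # Jitter on the diagonal keeps the Cholesky factorisation stable.
    L = np.linalg.cholesky(K + jitter * np.eye(len(x)))
    return L @ rng.standard_normal(len(x))

rng = np.random.default_rng(42)
x = np.linspace(-3.0, 3.0, 100)

# Layer 1: a GP maps the inputs x to latent values f1.
f1 = sample_gp(x, lengthscale=1.5, rng=rng)
# Layer 2: a second GP takes f1 as its inputs, producing the output f2.
f2 = sample_gp(f1, lengthscale=0.8, rng=rng)

print(f2[:5])  # one draw from the two-layer deep GP prior
```

The composition is also why exact inference is intractable: the second layer's kernel matrix depends nonlinearly on the random outputs of the first layer, so the marginal distribution of f2 is no longer Gaussian.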
Author Information
Jakob Lindinger (Bosch Center for Artificial Intelligence)
David Reeb (Bosch Center for Artificial Intelligence)
Christoph Lippert (Hasso Plattner Institute for Digital Engineering, Universität Potsdam)
Barbara Rakitsch (Bosch Center for Artificial Intelligence)
More from the Same Authors
- 2020: Learning Partially Known Stochastic Dynamics with Empirical PAC Bayes
  Manuel Haußmann · Sebastian Gerwinn · Andreas Look · Barbara Rakitsch · Melih Kandemir
- 2022: HAPNEST: An efficient tool for generating large-scale genetics datasets from limited training data
  Sophie Wharrie · Zhiyu Yang · Vishnu Raj · Remo Monti · Rahul Gupta · Ying Wang · Alicia Martin · Luke O'Connor · Samuel Kaski · Pekka Marttinen · Pier Palamara · Christoph Lippert · Andrea Ganna
- 2022 Poster: Learning interacting dynamical systems with latent Gaussian process ODEs
  Çağatay Yıldız · Melih Kandemir · Barbara Rakitsch
- 2020 Poster: 3D Self-Supervised Methods for Medical Imaging
  Aiham Taleb · Winfried Loetzsch · Noel Danz · Julius Severin · Thomas Gaertner · Benjamin Bergner · Christoph Lippert
- 2018 Poster: Learning Gaussian Processes by Minimizing PAC-Bayesian Generalization Bounds
  David Reeb · Andreas Doerr · Sebastian Gerwinn · Barbara Rakitsch
- 2013 Poster: It is all in the noise: Efficient multi-task Gaussian process inference with structured residuals
  Barbara Rakitsch · Christoph Lippert · Karsten Borgwardt · Oliver Stegle
- 2011 Poster: Learning sparse inverse covariance matrices in the presence of confounders
  Oliver Stegle · Christoph Lippert · Joris M Mooij · Neil D Lawrence · Karsten Borgwardt