Beyond the Mean-Field: Structured Deep Gaussian Processes Improve the Predictive Uncertainties
Jakob Lindinger · David Reeb · Christoph Lippert · Barbara Rakitsch

Thu Dec 10 09:00 AM -- 11:00 AM (PST) @ Poster Session 5 #1471

Deep Gaussian Processes learn probabilistic data representations for supervised learning by cascading multiple Gaussian Processes. While this model family promises flexible predictive distributions, exact inference is not tractable. Approximate inference techniques trade off the ability to closely resemble the posterior distribution against speed of convergence and computational efficiency. We propose a novel Gaussian variational family that allows for retaining covariances between latent processes while achieving fast convergence by marginalising out all global latent variables. After providing a proof of how this marginalisation can be done for general covariances, we restrict them to the ones we empirically found to be most important in order to also achieve computational efficiency. We provide an efficient implementation of our new approach and apply it to several benchmark datasets. It yields excellent results and strikes a better balance between accuracy and calibrated uncertainty estimates than its state-of-the-art alternatives.
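To make the cascading construction concrete, here is a minimal sketch (not the authors' implementation) of sampling from a two-layer deep Gaussian Process prior: the function drawn from the first GP is fed as input to the second, which is what makes the overall marginal non-Gaussian and exact inference intractable. The kernel choice and lengthscales below are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel k(x, x') = s^2 exp(-(x - x')^2 / (2 l^2))."""
    sq_dists = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / lengthscale ** 2)

def sample_gp(x, lengthscale=1.0, jitter=1e-6, rng=None):
    """Draw one function sample from a zero-mean GP prior evaluated at inputs x."""
    rng = np.random.default_rng() if rng is None else rng
    K = rbf_kernel(x, x, lengthscale) + jitter * np.eye(len(x))
    L = np.linalg.cholesky(K)  # K = L L^T, so L @ z ~ N(0, K)
    return L @ rng.standard_normal(len(x))

# Two-layer deep GP prior: the output of layer 1 becomes the input of layer 2.
rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 50)
h = sample_gp(x, lengthscale=1.5, rng=rng)   # layer 1: h = f1(x)
y = sample_gp(h, lengthscale=0.5, rng=rng)   # layer 2: y = f2(f1(x))
```

Variational schemes for this model introduce inducing variables per layer; the paper's contribution is a Gaussian variational family that keeps covariances between these latent processes while marginalising out the global latent variables for fast convergence.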

Author Information

Jakob Lindinger (Bosch Center for Artificial Intelligence)
David Reeb (Bosch Center for Artificial Intelligence (BCAI))
Christoph Lippert (Hasso Plattner Institute for Digital Engineering, Universität Potsdam)
Barbara Rakitsch (Bosch Center for Artificial Intelligence)