On Feature Collapse and Deep Kernel Learning for Single Forward Pass Uncertainty
Joost van Amersfoort · Lewis Smith · Andrew Jesson · Oscar Key · Yarin Gal
Event URL: https://openreview.net/forum?id=VucvDPwFlM6

Inducing point Gaussian process approximations are often considered a gold standard in uncertainty estimation since they retain many of the properties of the exact GP and scale to large datasets. A major drawback is that they have difficulty scaling to high-dimensional inputs. Deep Kernel Learning (DKL) promises a solution: a deep feature extractor transforms the inputs over which an inducing point Gaussian process is defined. However, DKL has been shown to provide unreliable uncertainty estimates in practice. We study why, and show that with no constraints, the DKL objective pushes "far-away" data points to be mapped to the same features as those of training-set points. With this insight we propose to constrain DKL's feature extractor to approximately preserve distances through a bi-Lipschitz constraint, resulting in a feature space favorable to DKL. We obtain a model, DUE, which demonstrates uncertainty quality outperforming previous DKL and other single forward pass uncertainty methods, while maintaining the speed and accuracy of standard neural networks.
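To make the idea concrete, the sketch below pairs a spectral-normalized feature extractor with an inducing point GP head built in GPyTorch. This is a minimal illustration, not the authors' released implementation: the paper constrains a deep residual network, whereas here a single spectral-normalized linear layer stands in for the bi-Lipschitz-constrained extractor, and all class names, dimensions, and hyperparameters are placeholders.

```python
import torch
import gpytorch

class FeatureExtractor(torch.nn.Module):
    """Hypothetical stand-in for DUE's constrained feature extractor.

    Spectral normalization upper-bounds each layer's Lipschitz constant;
    in the paper this is combined with residual connections to obtain an
    approximately bi-Lipschitz (distance-preserving) map.
    """

    def __init__(self, in_dim: int, feat_dim: int):
        super().__init__()
        self.layer = torch.nn.utils.spectral_norm(torch.nn.Linear(in_dim, feat_dim))

    def forward(self, x):
        return torch.relu(self.layer(x))

class GPHead(gpytorch.models.ApproximateGP):
    """Inducing point GP defined over the extracted features."""

    def __init__(self, inducing_points):
        variational_distribution = gpytorch.variational.CholeskyVariationalDistribution(
            inducing_points.size(0)
        )
        variational_strategy = gpytorch.variational.VariationalStrategy(
            self, inducing_points, variational_distribution,
            learn_inducing_locations=True,
        )
        super().__init__(variational_strategy)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )

# Toy usage: map inputs into feature space, then define the GP there.
feature_extractor = FeatureExtractor(in_dim=32, feat_dim=8)
with torch.no_grad():
    z = feature_extractor(torch.randn(64, 32))
gp = GPHead(inducing_points=z[:10])  # initialise inducing points from features
```

Training such a model end to end would maximize a variational ELBO (e.g. via gpytorch.mlls.VariationalELBO) with respect to both the GP parameters and the feature extractor, which is the single-forward-pass setup the abstract describes.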

Author Information

Joost van Amersfoort (University of Oxford)
Lewis Smith (University of Oxford)

Lewis Smith is a DPhil student supervised by Yarin Gal. His main interests are in the reliability and robustness of machine learning algorithms, Bayesian methods, and the utilisation of structure (such as invariances in the data). He is also a member of the [AIMS CDT](www.aims.robots.ox.ac.uk). Before joining OATML, he received his master's degree in physics from the University of Manchester.

Andrew Jesson (University of Oxford)
Oscar Key (University College London)
Yarin Gal (University of Oxford)
