Poster in Workshop: Bayesian Decision-making and Uncertainty: from probabilistic and spatiotemporal modeling to sequential experiment design
Inverse-Free Sparse Variational Gaussian Processes
Stefano Cortinovis · Stefanos Eleftheriadis · Laurence Aitchison · James Hensman · Mark van der Wilk
Keywords: [ Gaussian Processes ] [ Variational Inference ] [ Natural Gradients ]
Abstract:
Gaussian processes (GPs) are a powerful prior over functions, but inference requires inverting or decomposing the kernel matrix, making them poorly suited to modern hardware. To address this, variational bounds that require only matrix multiplications have been proposed, at the cost of introducing an additional variational parameter. In practice, however, optimising this additional parameter with typical deep learning optimisers is challenging, limiting the practical utility of these bounds. In this work, we solve this by introducing a preconditioner for the additional variational parameter, a tailored update for it based on natural gradients, and a stopping criterion to determine the number of updates. This leads to an inverse-free method that is on par with existing approaches on a per-iteration basis, with low-precision computation and wall-clock speedups as the next step.
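To make the "matmuls only" idea concrete, below is a minimal numerical sketch of one standard convex bound often used as a building block in inverse-free GP objectives: for any positive-definite matrix T, log det K <= tr(KT) - log det T - n, with equality at T = K^{-1}. This is an illustrative assumption, not necessarily the exact bound used in this work; the matrix names and helper function are hypothetical.

```python
# Hedged sketch: a matmul-only surrogate for log det K, a term that appears in
# GP marginal likelihoods. Instead of decomposing K, one optimises an auxiliary
# positive-definite parameter T. (Illustrative only; the paper's bound may differ.)
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
K = A @ A.T + n * np.eye(n)          # a well-conditioned "kernel" matrix

def logdet_upper_bound(K, T):
    """tr(K T) - log det T - n: needs only a matmul with K.

    In practice T would be parameterised via a Cholesky factor, making
    log det T cheap; K itself is never inverted or decomposed.
    """
    n = K.shape[0]
    return np.trace(K @ T) - np.linalg.slogdet(T)[1] - n

exact = np.linalg.slogdet(K)[1]
T_loose = np.eye(n) / n              # an arbitrary positive-definite guess
T_tight = np.linalg.inv(K)           # the optimum, used here only to check tightness

assert logdet_upper_bound(K, T_loose) >= exact          # valid for any PD T
assert abs(logdet_upper_bound(K, T_tight) - exact) < 1e-8  # tight at T = K^{-1}
```

Because the surrogate is convex in T and touches log det K at T = K^{-1}, T can be driven towards the optimum by gradient-based updates using only matrix multiplications, which is the property that makes such bounds hardware-friendly.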