Poster in Affinity Workshop: Women in Machine Learning

Gaussian Process parameterized Covariance Kernels for Non-stationary Regression

Vidhi Lalchand · Talay Cheema · Laurence Aitchison · Carl Edward Rasmussen


Abstract:

A large cross-section of the Gaussian process literature uses universal kernels like the squared exponential (SE) kernel along with automatic relevance determination (ARD) in high dimensions. The ARD framework in covariance kernels operates by pruning away extraneous dimensions through contracting their inverse lengthscales. This work considers probabilistic inference in the factorised Gibbs kernel (FGK) [Gibbs, 1998] and the multivariate Gibbs kernel (MGK) [Paciorek, 2003], both of which have input-dependent lengthscales. These kernels allow for non-stationary modelling, where samples from the posterior function space "adapt" to the varying smoothness structure inherent in the ground truth. We propose parameterizing the lengthscale function of the factorised and multivariate Gibbs covariance functions with a latent Gaussian process defined on the same inputs. For large datasets, we show how these non-stationary constructions are compatible with sparse inducing-variable formulations for regression. Experiments on synthetic and real-world spatial datasets for precipitation modelling and temperature trends demonstrate the feasibility and utility of the approach.
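For context, the factorised Gibbs kernel replaces the SE kernel's constant lengthscales with positive lengthscale functions l_d(x), one per input dimension: k(x, x') = s^2 * prod_d sqrt(2 l_d(x) l_d(x') / (l_d(x)^2 + l_d(x')^2)) * exp(-sum_d (x_d - x'_d)^2 / (l_d(x)^2 + l_d(x')^2)). The sketch below is a minimal NumPy illustration of this covariance, not the authors' implementation; `example_lengthscale_fn` is a hypothetical smooth stand-in for the latent-GP lengthscale process the abstract describes.

```python
import numpy as np

def gibbs_kernel(X1, X2, lengthscale_fn, variance=1.0):
    """Factorised Gibbs kernel (Gibbs, 1998) with input-dependent lengthscales.

    X1: (N, D) and X2: (M, D) input locations.
    lengthscale_fn: maps (N, D) inputs -> (N, D) positive lengthscales,
        one per input location and dimension.
    """
    L1 = lengthscale_fn(X1)  # (N, D)
    L2 = lengthscale_fn(X2)  # (M, D)
    # Pairwise sums of squared lengthscales per dimension: (N, M, D)
    denom = L1[:, None, :] ** 2 + L2[None, :, :] ** 2
    # Normalising prefactor, taken as a product over dimensions: (N, M)
    prefactor = np.prod(
        np.sqrt(2.0 * L1[:, None, :] * L2[None, :, :] / denom), axis=-1
    )
    # Squared distances scaled by the input-dependent lengthscales: (N, M)
    sq_dist = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2 / denom, axis=-1)
    return variance * prefactor * np.exp(-sq_dist)

# Hypothetical lengthscale function for illustration only; in the paper this
# role is played by a latent GP defined on the same inputs (e.g. via a
# log-lengthscale process, which keeps l_d(x) positive).
def example_lengthscale_fn(X):
    return 0.5 + 0.4 * np.abs(np.sin(X))  # positive by construction

X = np.random.default_rng(0).uniform(-3.0, 3.0, size=(50, 1))
K = gibbs_kernel(X, X, example_lengthscale_fn)
# The Gibbs kernel is positive semi-definite; a small jitter stabilises the
# Cholesky factorisation used in downstream GP regression.
chol = np.linalg.cholesky(K + 1e-8 * np.eye(len(X)))
```

Because the lengthscales vary with x, the resulting prior shrinks its correlation range wherever l_d(x) is small, which is what lets posterior samples adapt to locally rough regions of the ground truth.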
