

Poster

Robust Gaussian Processes via Relevance Pursuit

Sebastian Ament · Ben Letham · David Eriksson · Elizabeth Santorella · Maximilian Balandat · Eytan Bakshy

East Exhibit Hall A-C #4000
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Gaussian Processes (GPs) are non-parametric probabilistic regression models that are popular due to their flexibility, data efficiency, and well-calibrated uncertainty estimates. However, in their standard form, GPs assume a homoskedastic Gaussian noise model, while many real-world applications are subject to non-Gaussian noise. Examples include random failures in experiments and heavy-tailed noise in stochastic experiments common to machine learning, perception science, and financial applications. Variants of GPs that are more robust to alternative noise models have been proposed, but they require significant trade-offs between nominal performance, degree of robustness, computational requirements, and theoretical guarantees. In this work, we propose and study a GP model that achieves robustness against sparse outliers by simply inferring data-point-specific noise levels with a sequential selection procedure, which we refer to as "relevance pursuit", that maximizes the log marginal likelihood. Surprisingly, we show that a particular parameterization of the model renders the log marginal likelihood strongly concave in the data-point-specific noise variances, a property rarely found in either robust regression objectives or GP marginal likelihoods. This in turn implies the weak submodularity of the objective, and thereby yields approximation guarantees for the proposed algorithm. We compare the model's performance to that of other approaches on diverse regression and Bayesian optimization tasks, including the challenging but common setting of sparse corruptions of the labels within the function range.
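The abstract describes the algorithm at a high level: a sequential selection loop over data-point-specific noise variances that, at each step, flags the point whose inflated noise variance most improves the GP log marginal likelihood. Below is a minimal NumPy sketch of that idea under a greedy forward-selection reading of the procedure. It is illustrative, not the paper's implementation: the `rbf_kernel` and `relevance_pursuit` names and the fixed `inflation` heuristic are assumptions (the paper optimizes each noise variance under a particular parameterization rather than using a fixed value).

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0, outputscale=1.0):
    # Squared-exponential kernel matrix for inputs X of shape (n, d).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return outputscale * np.exp(-0.5 * d2 / lengthscale**2)

def log_marginal_likelihood(K, y, rho, base_noise):
    # Gaussian log marginal likelihood with per-point noise variances rho_i
    # added to the diagonal on top of a shared base noise level.
    n = len(y)
    C = K + np.diag(rho + base_noise)
    L = np.linalg.cholesky(C)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * n * np.log(2 * np.pi)

def relevance_pursuit(K, y, base_noise=1e-2, max_outliers=5, inflation=None):
    # Greedy forward selection: repeatedly add the data point whose inflated
    # noise variance yields the largest log-marginal-likelihood improvement;
    # stop when no candidate improves the objective.
    n = len(y)
    if inflation is None:
        inflation = 10.0 * np.var(y)  # heuristic noise level for a flagged outlier
    rho = np.zeros(n)
    selected = []
    best = log_marginal_likelihood(K, y, rho, base_noise)
    for _ in range(max_outliers):
        scores = np.full(n, -np.inf)
        for i in range(n):
            if i in selected:
                continue
            trial = rho.copy()
            trial[i] = inflation
            scores[i] = log_marginal_likelihood(K, y, trial, base_noise)
        i_star = int(np.argmax(scores))
        if scores[i_star] <= best:
            break  # no remaining point improves the marginal likelihood
        rho[i_star] = inflation
        selected.append(i_star)
        best = scores[i_star]
    return rho, selected
```

A small usage example with sparse label corruptions, mirroring the setting the abstract highlights:

```python
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (40, 1))
y = np.sin(6.0 * X[:, 0]) + 0.05 * rng.standard_normal(40)
y[[3, 17]] += 3.0  # sparse corruptions within the function range

K = rbf_kernel(X, lengthscale=0.2)
rho, outliers = relevance_pursuit(K, y)
print("flagged points:", outliers)  # ideally recovers indices 3 and 17
```

The strong-concavity result in the abstract is what justifies this kind of greedy scheme: it implies weak submodularity of the objective, which is the standard route to approximation guarantees for greedy selection.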
