In Gaussian process regression the observation model is commonly assumed to be Gaussian, which is convenient from a computational perspective. However, the drawback is that the predictive accuracy of the model can be significantly compromised if the observations are contaminated by outliers. A robust observation model, such as the Student-t distribution, reduces the influence of outlying observations and improves the predictions. The problem, however, is that inference then becomes analytically intractable. In this work, we discuss the properties of a Gaussian process regression model with the Student-t likelihood and utilize the Laplace approximation for approximate inference. We compare our approach to a variational approximation and to a Markov chain Monte Carlo scheme, both of which utilize the commonly used scale-mixture representation of the Student-t distribution.
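To make the approach concrete, the sketch below illustrates a Laplace approximation for GP regression with a Student-t likelihood: a quasi-Newton search for the posterior mode f_hat of sum_i log p(y_i | f_i) - 0.5 f^T K^{-1} f, followed by a Gaussian approximation with covariance (K^{-1} + W)^{-1}, where W is the negative Hessian of the log-likelihood. This is a minimal illustration, not the authors' implementation; the squared-exponential kernel, the fixed hyperparameters (nu, sigma, lengthscale), and the toy data are assumptions made only for this example.

```python
# Minimal sketch of the Laplace approximation for GP regression with a
# Student-t likelihood. Kernel choice, hyperparameters, and toy data are
# illustrative assumptions, not the configuration used in the paper.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import t as student_t

def se_kernel(x1, x2, lengthscale=1.0, magnitude=1.0):
    """Squared-exponential covariance matrix (illustrative choice)."""
    d = x1[:, None] - x2[None, :]
    return magnitude**2 * np.exp(-0.5 * (d / lengthscale) ** 2)

def laplace_gp_student_t(x, y, nu=4.0, sigma=0.1, jitter=1e-6):
    """Return the posterior mode f_hat and the covariance of the Laplace
    approximation q(f | y) = N(f_hat, (K^{-1} + W)^{-1})."""
    n = len(y)
    K = se_kernel(x, x) + jitter * np.eye(n)
    K_inv = np.linalg.inv(K)

    def neg_log_posterior(f):
        # -[ sum_i log t_nu(y_i | f_i, sigma) - 0.5 f^T K^{-1} f ]
        loglik = student_t.logpdf(y, df=nu, loc=f, scale=sigma).sum()
        return -(loglik - 0.5 * f @ K_inv @ f)

    def grad(f):
        r = y - f
        dloglik = (nu + 1) * r / (nu * sigma**2 + r**2)
        return -(dloglik - K_inv @ f)

    # The Student-t log-likelihood is not concave in f, so a quasi-Newton
    # search is used here instead of the plain Newton iteration.
    res = minimize(neg_log_posterior, np.zeros(n), jac=grad, method="L-BFGS-B")
    f_hat = res.x

    # W = -d^2 log p(y|f) / df^2; its entries can be negative for large
    # residuals, which is what makes the Student-t model robust (and the
    # inference harder than in the Gaussian case).
    r = y - f_hat
    W = np.diag((nu + 1) * (nu * sigma**2 - r**2) / (nu * sigma**2 + r**2) ** 2)
    cov = np.linalg.inv(K_inv + W)
    return f_hat, cov

# Toy data with a single outlier to show the robustness of the model.
rng = np.random.default_rng(0)
x = np.linspace(0, 5, 30)
y = np.sin(x) + 0.1 * rng.standard_normal(30)
y[15] += 3.0  # outlier
f_hat, cov = laplace_gp_student_t(x, y)
```

Because the mode f_hat downweights the outlying observation, the fitted latent function stays close to the underlying sine curve, whereas a Gaussian likelihood would pull it toward the outlier.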
Author Information
Jarno Vanhatalo (Helsinki University of Technology)
Pasi Jylänki (G-Research)
Aki Vehtari (Aalto University)
More from the Same Authors
- 2021: Make cross-validation Bayes again
  Yuling Yao · Aki Vehtari
- 2021 Poster: Challenges and Opportunities in High Dimensional Variational Inference
  Akash Kumar Dhaka · Alejandro Catalina · Manushi Welandawe · Michael Andersen · Jonathan Huggins · Aki Vehtari
- 2020 Poster: Robust, Accurate Stochastic Optimization for Variational Inference
  Akash Kumar Dhaka · Alejandro Catalina · Michael Andersen · Måns Magnusson · Jonathan Huggins · Aki Vehtari
- 2020 Poster: Hamiltonian Monte Carlo using an adjoint-differentiated Laplace approximation: Bayesian inference for latent Gaussian models and beyond
  Charles Margossian · Aki Vehtari · Daniel Simpson · Raj Agrawal
- 2018: Software Panel
  Ben Letham · David Duvenaud · Dustin Tran · Aki Vehtari