In many supervised learning tasks, learning what changes do not affect the prediction target is as crucial to generalisation as learning what does. Data augmentation is a common way to force a model to exhibit an invariance: training data is modified according to an invariance designed by a human and added to the training data. We argue that invariances should instead be incorporated into the model structure and learned using the marginal likelihood, which can correctly reward the reduced complexity of invariant models. We incorporate invariances in a Gaussian process, for which good marginal likelihood approximations are available. Our main contribution is the derivation of a variational inference scheme for invariant Gaussian processes where the invariance is described by a probability distribution that can be sampled from, much like how data augmentation is implemented in practice.
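The paper's variational scheme is more involved than this, but the core idea the abstract describes — building an invariant model by averaging a kernel over transformations of its inputs, then letting the marginal likelihood reward the reduced complexity — can be sketched with plain NumPy for exact GP regression. All names, the sign-flip example, and the hyperparameters below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def rbf(X, Z, lengthscale=1.0, variance=1.0):
    """Squared-exponential base kernel between rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def invariant_kernel(X, Z, transforms):
    """Double average of the base kernel over a set of transformations:
    k_inv(x, z) = E_{a, a'}[k(a(x), a'(z))].
    Applying the same transformation set to both arguments keeps the
    Gram matrix positive semi-definite."""
    S = len(transforms)
    K = sum(rbf(a(X), b(Z)) for a in transforms for b in transforms)
    return K / S ** 2

def log_marginal_likelihood(K, y, noise_var=0.01):
    """log N(y | 0, K + noise_var * I) via a Cholesky factorisation."""
    n = len(y)
    L = np.linalg.cholesky(K + noise_var * np.eye(n))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * n * np.log(2 * np.pi))

# Toy data with a sign-flip symmetry: y(x) = y(-x).
X = np.linspace(-3.0, 3.0, 24)[:, None]
y = np.cos(X[:, 0])

# The sign-flip group is small enough to enumerate exactly; for a
# continuous augmentation distribution (e.g. random rotations of images)
# one would instead draw transformations at random, as the paper does.
transforms = [lambda v: v, lambda v: -v]

lml_standard = log_marginal_likelihood(rbf(X, X), y)
lml_invariant = log_marginal_likelihood(invariant_kernel(X, X, transforms), y)
print(f"standard  RBF kernel: {lml_standard:.2f}")
print(f"invariant RBF kernel: {lml_invariant:.2f}")
```

On this symmetric data the invariant prior wastes no probability mass on asymmetric functions, so its marginal likelihood exceeds the standard kernel's — the "reduced complexity" reward the abstract refers to, which data-augmentation training alone cannot measure.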
Author Information
Mark van der Wilk (PROWLER.io)
Matthias Bauer (Max Planck Institute for Intelligent Systems)
ST John (PROWLER.io)
James Hensman (PROWLER.io)
More from the Same Authors
- 2021 Spotlight: Speedy Performance Estimation for Neural Architecture Search »
  Robin Ru · Clare Lyle · Lisa Schut · Miroslav Fil · Mark van der Wilk · Yarin Gal
- 2021 Poster: Speedy Performance Estimation for Neural Architecture Search »
  Robin Ru · Clare Lyle · Lisa Schut · Miroslav Fil · Mark van der Wilk · Yarin Gal
- 2021 Poster: Deep Neural Networks as Point Estimates for Deep Gaussian Processes »
  Vincent Dutordoir · James Hensman · Mark van der Wilk · Carl Henrik Ek · Zoubin Ghahramani · Nicolas Durrande
- 2019 Poster: Bayesian Layers: A Module for Neural Network Uncertainty »
  Dustin Tran · Mike Dusenberry · Mark van der Wilk · Danijar Hafner
- 2019 Poster: Pseudo-Extended Markov chain Monte Carlo »
  Christopher Nemeth · Fredrik Lindsten · Maurizio Filippone · James Hensman
- 2019 Poster: Scalable Bayesian dynamic covariance modeling with variational Wishart and inverse Wishart processes »
  Creighton Heaukulani · Mark van der Wilk
- 2018 Poster: Gaussian Process Conditional Density Estimation »
  Vincent Dutordoir · Hugh Salimbeni · James Hensman · Marc Deisenroth
- 2018 Poster: Infinite-Horizon Gaussian Processes »
  Arno Solin · James Hensman · Richard Turner
- 2017 Poster: Convolutional Gaussian Processes »
  Mark van der Wilk · Carl Edward Rasmussen · James Hensman
- 2017 Oral: Convolutional Gaussian Processes »
  Mark van der Wilk · Carl Edward Rasmussen · James Hensman
- 2017 Poster: Identification of Gaussian Process State Space Models »
  Stefanos Eleftheriadis · Tom Nicholson · Marc Deisenroth · James Hensman
- 2016 Poster: Understanding Probabilistic Sparse Gaussian Process Approximations »
  Matthias Bauer · Mark van der Wilk · Carl Edward Rasmussen
- 2015 Poster: MCMC for Variationally Sparse Gaussian Processes »
  James Hensman · Alexander Matthews · Maurizio Filippone · Zoubin Ghahramani
- 2014 Poster: Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models »
  Yarin Gal · Mark van der Wilk · Carl Edward Rasmussen
- 2013 Workshop: Probabilistic Models for Big Data »
  Neil D Lawrence · Joaquín Quiñonero-Candela · Tianshi Gao · James Hensman · Zoubin Ghahramani · Max Welling · David Blei · Ralf Herbrich
- 2013 Session: Tutorial Session A »
  James Hensman
- 2012 Poster: Fast Variational Inference in the Conjugate Exponential Family »
  James Hensman · Magnus Rattray · Neil D Lawrence