University of Cambridge
Advances in Gaussian Processes
9:30 - 11:30am Monday, December 04, 2006
Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. Although these models have a long history in statistics, their potential has only become widely appreciated in the machine learning community during the past decade. This tutorial will introduce GPs and their application to regression and classification, and outline recent computational developments. GPs are a natural framework for Bayesian inference about functions, providing full predictive distributions and a principled framework for inference, including model selection. The prior over functions is given in a hierarchical form, where the covariance function (or kernel) controls the properties of the functions in a way which allows interpretation of the model. Whereas inference in the simplest regression case can be done in closed form, inference in classification models is intractable, and several approximations have been proposed, e.g. the Expectation Propagation (EP) algorithm. A central limitation in the applicability of GPs to problems with large numbers of examples is that naïve implementations require memory quadratic and time cubic in the number of examples, making direct treatment of more than a few thousand cases inconvenient. Recent work on sparse approximations addresses these issues.
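The closed-form regression case mentioned above can be sketched in a few lines of NumPy. The sketch below (function and parameter names are my own, not from the tutorial) computes the standard GP predictive mean and variance under a squared-exponential kernel with Gaussian observation noise; the Cholesky factorization is the step responsible for the cubic time and quadratic memory cost noted in the abstract.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of points."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X, y, Xs, lengthscale=1.0, noise=0.1):
    """Closed-form GP regression: predictive mean and variance at test inputs Xs."""
    n = len(X)
    K = rbf_kernel(X, X, lengthscale) + noise**2 * np.eye(n)
    L = np.linalg.cholesky(K)              # O(n^3) time, O(n^2) memory
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf_kernel(X, Xs, lengthscale)    # cross-covariance, shape (n, m)
    mu = Ks.T @ alpha                      # predictive mean
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(Xs, Xs, lengthscale)) - np.sum(v**2, axis=0)
    return mu, var                         # predictive variance of the latent f*
```

Far from the training data the cross-covariance vanishes, so the predictive variance reverts to the prior variance, one of the interpretable properties of the kernel that the abstract refers to.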
Carl Edward Rasmussen is a junior research group leader at the Max Planck Institute for Biological Cybernetics in Tübingen, Germany, in the department of Empirical Inference for Machine Learning and Perception, where he does research on Bayesian inference and machine learning. He received his Master's in Engineering from the Technical University of Denmark, and his PhD in Computer Science from the University of Toronto in 1996. He has since been a postdoc at the Technical University of Denmark, and a senior research fellow at the Gatsby Computational Neuroscience Unit at University College London from 2000 to 2002. He has worked extensively on Gaussian process models and recently co-authored Rasmussen & Williams, "Gaussian Processes for Machine Learning" (MIT Press, 2006).