Gaussian processes (GPs) are Bayesian nonparametric models for continuous functions which allow for uncertainty quantification, interpretability, and the incorporation of expert knowledge. The theory and practice of GPs have flourished in the last decade: researchers have studied the expressiveness and efficiency of GP-based models, and practitioners have applied them to a plethora of disciplines. This tutorial presents both the foundational theory and modern developments of data modelling using GPs, following step-by-step intuitions, illustrations and real-world examples. The tutorial starts with an emphasis on the building blocks of the GP model, and then moves on to the choice of the kernel function, cost-effective training strategies and non-Gaussian extensions. The second part of the tutorial showcases more recent advances, such as latent variable models, deep GPs, current trends in kernel design, and connections between GPs and deep neural networks. We hope that this exhibition, featuring classic and contemporary GP works, inspires attendees to incorporate GPs into their applications and motivates them to continue learning and contributing to the current developments in the field.
Mon 9:00 a.m. - 9:05 a.m.
Live Intro (Intro)
Welcome message, presentation of the speakers and brief overview of the talk.
Felipe Tobar · César Lincoln Mattos
Mon 9:05 a.m. - 9:20 a.m.
Fundamentals of Bayesian Inference (Talk)
We revisit the main components of Bayesian inference underpinning the treatment of the Gaussian process. Adopting an illustrative approach, we review statistical models, the approaches to learning the model parameters, and the effect of the parameters' cardinality.
Felipe Tobar
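For reference, the central object in this segment is the posterior over a model's parameters, given by Bayes' rule; the notation below (θ for parameters, D for data) is ours, not necessarily the slides':

```latex
p(\theta \mid \mathcal{D}) = \frac{p(\mathcal{D} \mid \theta)\, p(\theta)}{p(\mathcal{D})},
\qquad
p(\mathcal{D}) = \int p(\mathcal{D} \mid \theta)\, p(\theta)\, \mathrm{d}\theta .
```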
Mon 9:20 a.m. - 9:35 a.m.
From One to Finitely-Many Gaussian RVs (Talk)
We motivate the use of Gaussian random variables (RVs) from conceptual and practical perspectives. We examine the main properties of the Gaussian model and show how Gaussian RVs can be combined to construct Gaussian vectors.
Felipe Tobar
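The key property exploited when combining Gaussian RVs into Gaussian vectors is closure under affine transformations; stated compactly (our notation):

```latex
\mathbf{x} \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})
\;\Longrightarrow\;
A\mathbf{x} + \mathbf{b} \sim \mathcal{N}\!\left(A\boldsymbol{\mu} + \mathbf{b},\; A\boldsymbol{\Sigma}A^{\top}\right).
```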
Mon 9:35 a.m. - 9:50 a.m.
Q&A
Mon 9:50 a.m. - 10:10 a.m.
Infinitely-many Gaussian RVs: The Gaussian process (Talk)
We study the covariance function and its marginalisation properties to motivate the existence of an infinite collection of Gaussian RVs, that is, the Gaussian process. We then present the prior and posterior GP in an illustrative manner.
Felipe Tobar
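For reference, the posterior GP presented here has the standard closed form under a zero-mean prior and Gaussian observation noise of variance σ²; K_{XX} denotes the kernel matrix over the training inputs X, and the subscript * the test inputs (our notation):

```latex
\mathbf{y} = f(X) + \boldsymbol{\epsilon}, \quad \boldsymbol{\epsilon} \sim \mathcal{N}(\mathbf{0}, \sigma^2 I), \qquad
p(\mathbf{f}_* \mid \mathbf{y}) = \mathcal{N}(\boldsymbol{\mu}_*, \boldsymbol{\Sigma}_*), \\
\boldsymbol{\mu}_* = K_{*X}\left(K_{XX} + \sigma^2 I\right)^{-1}\mathbf{y}, \qquad
\boldsymbol{\Sigma}_* = K_{**} - K_{*X}\left(K_{XX} + \sigma^2 I\right)^{-1} K_{X*}.
```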
Mon 10:10 a.m. - 10:25 a.m.
Choosing the Kernel and Learning its Parameters (Talk)
Building on the use of a Gaussian likelihood, we present a learning objective and address the problem of choosing a kernel and optimising its parameters. We review different kernel functions and their properties.
Felipe Tobar
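The learning objective under a Gaussian likelihood is the log marginal likelihood (evidence), maximised with respect to the kernel hyperparameters θ and the noise variance σ² (our notation):

```latex
\log p(\mathbf{y} \mid X, \theta)
= -\tfrac{1}{2}\,\mathbf{y}^{\top}\!\left(K_{\theta} + \sigma^{2} I\right)^{-1}\mathbf{y}
  -\tfrac{1}{2}\log\left|K_{\theta} + \sigma^{2} I\right|
  -\tfrac{n}{2}\log 2\pi .
```

The first term rewards data fit, the second penalises model complexity, and their trade-off is what allows evidence maximisation to select sensible hyperparameters.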
Mon 10:25 a.m. - 10:35 a.m.
Implementation of a GP (Talk)
We show how to implement a vanilla GP from scratch, covering the particulars of coding the kernel, the likelihood function and the train() function; a minimal sketch is given below. We apply this minimal GP toolbox to a couple of datasets to illustrate the GP's ease of use and modelling abilities.
Felipe Tobar
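As a companion to this segment, here is a minimal from-scratch sketch of a vanilla GP regressor: an RBF kernel, the negative log marginal likelihood as the training loss, a train() routine and a predictive posterior. This is our own illustration (the names rbf and predict, and the toy data, are our choices), not the speakers' code.

```python
# Minimal GP regression from scratch: RBF kernel, evidence-based training, posterior prediction.
import numpy as np
from scipy.optimize import minimize
from scipy.linalg import cho_factor, cho_solve

def rbf(X1, X2, lengthscale, variance):
    """Squared-exponential kernel k(x, x') = variance * exp(-|x - x'|^2 / (2 l^2))."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def neg_log_marginal_likelihood(log_params, X, y):
    """Negative log evidence; hyperparameters live in log space to stay positive."""
    lengthscale, variance, noise = np.exp(log_params)
    K = rbf(X, X, lengthscale, variance) + noise * np.eye(len(X))
    L, lower = cho_factor(K, lower=True)          # Cholesky for stable inversion
    alpha = cho_solve((L, lower), y)              # alpha = K^{-1} y
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L))) + 0.5 * len(X) * np.log(2 * np.pi)

def train(X, y):
    """Fit (lengthscale, signal variance, noise variance) by maximising the evidence."""
    res = minimize(neg_log_marginal_likelihood, np.zeros(3), args=(X, y), method="L-BFGS-B")
    return np.exp(res.x)

def predict(X_test, X, y, lengthscale, variance, noise):
    """Posterior mean and predictive variance (including noise) at the test inputs."""
    K = rbf(X, X, lengthscale, variance) + noise * np.eye(len(X))
    Ks = rbf(X_test, X, lengthscale, variance)
    L, lower = cho_factor(K, lower=True)
    mean = Ks @ cho_solve((L, lower), y)
    v = cho_solve((L, lower), Ks.T)
    var = variance - np.sum(Ks * v.T, axis=1) + noise
    return mean, var

# Toy usage: noisy samples from a sine wave.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)
ls, var, noise = train(X, y)
mean, pred_var = predict(np.linspace(-3, 3, 50)[:, None], X, y, ls, var, noise)
```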
Mon 10:35 a.m. - 10:50 a.m.
Q&A
Mon 10:50 a.m. - 11:00 a.m.
Break
Mon 11:00 a.m. - 11:15 a.m.
Beyond Gaussian Likelihood (Talk)
We expand the applicability of GP models by introducing non-Gaussian likelihoods. We show why inference becomes intractable and how it can be approximated. Through illustrations, we focus on the binary classification scenario, warping functions, and heteroscedastic models.
César Lincoln Mattos
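To see why inference becomes intractable, consider binary classification with a Bernoulli likelihood and sigmoid link σ (our notation): the evidence involves a non-Gaussian integral over the latent function values,

```latex
p(\mathbf{y} \mid X) = \int \prod_{i=1}^{n} \sigma(f_i)^{y_i}\,\bigl(1-\sigma(f_i)\bigr)^{1-y_i}\; \mathcal{N}(\mathbf{f} \mid \mathbf{0}, K_{XX})\, \mathrm{d}\mathbf{f},
```

which, unlike the Gaussian case, has no closed form and must be approximated, e.g. by the Laplace approximation, expectation propagation or variational inference.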
Mon 11:15 a.m. - 11:37 a.m.
Sparse Approximations (Talk)
Standard GPs present scalability issues; in this section we show how sparse approximations enable the use of GPs on large datasets. We consider, in particular, the variational approach based on inducing variables, which yields tractable approximations for performing predictions and for model selection via optimisation of the evidence lower bound (ELBO).
César Lincoln Mattos
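For reference, the collapsed variational bound with m inducing inputs Z (Titsias, 2009) reads as follows, where Q_{XX} = K_{XZ} K_{ZZ}^{-1} K_{ZX} is the Nyström approximation of K_{XX} (our notation):

```latex
\log p(\mathbf{y}) \;\geq\;
\log \mathcal{N}\!\left(\mathbf{y} \mid \mathbf{0},\, Q_{XX} + \sigma^{2} I\right)
\;-\; \frac{1}{2\sigma^{2}}\,\operatorname{tr}\!\left(K_{XX} - Q_{XX}\right),
```

which can be evaluated in O(nm²) time rather than the O(n³) of exact GP regression.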
Mon 11:37 a.m. - 11:52 a.m.
Q&A
Mon 11:52 a.m. - 12:05 p.m.
Current Trends on Kernel Design (Talk)
We highlight some recent advances in kernel design that enhance the use of GP models in complex data settings. From the automatic composition of simple kernels to deep kernel learning and (graph) convolutional kernels, the works reviewed in this section indicate how fast-paced research on kernel design is.
César Lincoln Mattos
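The automatic composition mentioned above rests on the closure properties of positive-semidefinite kernels: sums and products of valid kernels are themselves valid kernels,

```latex
k(\mathbf{x}, \mathbf{x}') = k_1(\mathbf{x}, \mathbf{x}') + k_2(\mathbf{x}, \mathbf{x}'),
\qquad
k(\mathbf{x}, \mathbf{x}') = k_1(\mathbf{x}, \mathbf{x}')\, k_2(\mathbf{x}, \mathbf{x}'),
```

so structured covariances (e.g. periodic × RBF for locally-periodic patterns) can be searched over compositionally.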
Mon 12:05 p.m. - 12:27 p.m.
From GPLVM to Deep GPs (Talk)
We consider the unsupervised setting, more specifically the task of nonlinear dimensionality reduction. We summarise the GP latent variable model (GPLVM) and indicate how it can be used as a building block for other generative models, such as the deep GP and other flexible probabilistic models.
César Lincoln Mattos
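For reference, the GPLVM places a GP prior on the mapping from an unobserved, low-dimensional latent x_n to each observed dimension d (our notation):

```latex
\mathbf{x}_n \sim \mathcal{N}(\mathbf{0}, I), \qquad
y_{nd} = f_d(\mathbf{x}_n) + \epsilon_{nd}, \qquad
f_d \sim \mathcal{GP}(0, k), \quad \epsilon_{nd} \sim \mathcal{N}(0, \sigma^2),
```

and a deep GP is obtained by stacking such layers, feeding the output of one GP layer as the input of the next.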
Mon 12:27 p.m. - 12:42 p.m.
Multioutput GPs (Talk)
We motivate the need for multioutput GPs (MOGPs) and show how they can be constructed by mixing independent GPs. Then, we review standard approaches to covariance design and their implications. We conclude this section with real-world examples.
Felipe Tobar
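A standard instance of mixing independent GPs is the linear model of coregionalisation (LMC), which combines Q independent latent GPs into D correlated outputs (our notation):

```latex
f_d(\mathbf{x}) = \sum_{q=1}^{Q} a_{dq}\, u_q(\mathbf{x}), \quad u_q \sim \mathcal{GP}(0, k_q)
\;\Longrightarrow\;
\operatorname{cov}\bigl(f_d(\mathbf{x}), f_{d'}(\mathbf{x}')\bigr) = \sum_{q=1}^{Q} a_{dq}\, a_{d'q}\, k_q(\mathbf{x}, \mathbf{x}').
```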
Mon 12:42 p.m. - 12:47 p.m.
Concluding Remarks (Talk)
Final remarks: the take-home message, what has been left out of the tutorial, and proposed avenues for further study.
Felipe Tobar · César Lincoln Mattos
Mon 12:47 p.m. - 1:00 p.m.
Q&A
Author Information
César Lincoln Mattos (Federal University of Ceará)
César Lincoln Cavalcante Mattos is an associate professor at the Department of Computer Science of the Federal University of Ceará (UFC), Brazil. He is also an associate researcher at the Logics and Artificial Intelligence Group (LOGIA). His research interests span the broad fields of machine learning and probabilistic modelling, such as Gaussian processes, deep (probabilistic) learning, approximate inference and system identification. He has applied learning methods in several research and development collaborations in areas such as dynamical system modelling, health risk analysis, software repository mining and anomaly detection.
Felipe Tobar (Universidad de Chile)
Felipe Tobar is an Assistant Professor at the Data & AI Initiative at Universidad de Chile. He holds Researcher positions at the Center for Mathematical Modeling and the Advanced Center for Electrical Engineering. Felipe received the BSc/MSc degrees in Electrical Engineering (U. de Chile, 2010) and a PhD in Signal Processing (Imperial College London, 2014), and he was an Associate Researcher in Machine Learning at the University of Cambridge (2014-2015). Felipe teaches Statistics and Machine Learning courses at undergraduate, graduate and professional levels. His research interests lie in the interface between Machine Learning and Statistical Signal Processing, including Gaussian processes, spectral estimation, approximate inference, Bayesian nonparametrics, and optimal transport.