Spectral Mixture Kernels for Multi-Output Gaussian Processes
Gabriel Parra · Felipe Tobar

Tue Dec 05 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #197

Early approaches to multiple-output Gaussian processes (MOGPs) relied on linear combinations of independent, latent, single-output Gaussian processes (GPs). This resulted in cross-covariance functions with limited parametric interpretation, in contrast with single-output GPs, whose hyperparameters directly convey lengthscales, frequencies and magnitudes, among other properties. More recent approaches to MOGPs interpret the relationship between different channels more readily by directly modelling the cross-covariances as a spectral mixture kernel with a phase shift. We extend this rationale and propose a parametric family of complex-valued cross-spectral densities, then build on Cramér's Theorem (the multivariate version of Bochner's Theorem) to provide a principled approach to designing multivariate covariance functions. The so-constructed kernels are able to model delays among channels in addition to phase differences, and are thus more expressive than previous methods while also providing full parametric interpretation of the relationship across channels. The proposed method is first validated on synthetic data and then compared to existing MOGP methods on two real-world examples.
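To make the idea concrete, here is a minimal one-dimensional sketch of a single-component, spectral-mixture-style cross-covariance between two channels: a Gaussian envelope multiplied by a cosine, augmented with a per-channel-pair delay and phase offset as the abstract describes. The parameter names and the exact parameterisation below are illustrative assumptions, not the paper's precise formula.

```python
import numpy as np

def cross_cov(tau, magnitude=1.0, mean_freq=0.5, lengthscale=1.0,
              delay=0.3, phase=0.7):
    """Sketch of a spectral-mixture-style cross-covariance k_ij(tau).

    A Gaussian envelope times a cosine (the inverse Fourier transform of a
    Gaussian spectral density centred at mean_freq), shifted by a delay and
    a phase between channels i and j. Parameter names are hypothetical.
    """
    shifted = tau + delay                      # delay between the two channels
    envelope = np.exp(-0.5 * (shifted / lengthscale) ** 2)
    return magnitude * envelope * np.cos(2 * np.pi * mean_freq * shifted + phase)

# Evaluate the cross-covariance over a grid of lags.
taus = np.linspace(-5.0, 5.0, 201)
k = cross_cov(taus)
```

Note that with a nonzero delay or phase the function is no longer even in the lag, which is exactly the extra expressiveness the proposed kernels add over symmetric single-output covariances; setting `delay=0.0` and `phase=0.0` recovers an even (standard spectral mixture) covariance.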

Author Information

Gabriel Parra (Universidad de Chile)
Felipe Tobar (Universidad de Chile)

Felipe Tobar is an Assistant Professor at the Data & AI Initiative at Universidad de Chile. He holds Researcher positions at the Center for Mathematical Modeling and the Advanced Center for Electrical Engineering. Felipe received the BSc/MSc degrees in Electrical Engineering (U. de Chile, 2010) and a PhD in Signal Processing (Imperial College London, 2014), and he was an Associate Researcher in Machine Learning at the University of Cambridge (2014-2015). Felipe teaches Statistics and Machine Learning courses at undergraduate, graduate and professional levels. His research interests lie in the interface between Machine Learning and Statistical Signal Processing, including Gaussian processes, spectral estimation, approximate inference, Bayesian nonparametrics, and optimal transport.
