Gaussian Processes (GPs) are popular surrogate models for sequential decision-making tasks such as Bayesian Optimization and Active Learning. Such frameworks often exploit well-known, cheap formulas for conditioning a GP posterior on new data. However, these standard updates cannot be applied to popular but more complex models, such as sparse GPs or models with non-conjugate likelihoods, for which no such closed-form update formulas exist. Using an alternative sparse dual GP parameterization, we show that these costly computations can be avoided while still enjoying one-step updates for non-Gaussian likelihoods. The resulting algorithms allow for cheap batch formulations that work with most acquisition functions.
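To make the "well-known cheap methods" concrete, here is a minimal sketch of the standard rank-one conditioning update for exact GP regression (standard textbook notation, not the paper's): given the posterior after $n$ observations, with mean $\mu_n$ and covariance $k_n$, conditioning on one new observation $(x_{n+1}, y_{n+1})$ under Gaussian noise with variance $\sigma^2$ gives

\[
\mu_{n+1}(x) = \mu_n(x) + \frac{k_n(x, x_{n+1})}{k_n(x_{n+1}, x_{n+1}) + \sigma^2}\,\bigl(y_{n+1} - \mu_n(x_{n+1})\bigr),
\]
\[
k_{n+1}(x, x') = k_n(x, x') - \frac{k_n(x, x_{n+1})\,k_n(x_{n+1}, x')}{k_n(x_{n+1}, x_{n+1}) + \sigma^2}.
\]

It is exactly this kind of closed-form update that is unavailable for sparse variational GPs and non-Gaussian likelihoods, and which the dual (site-based) parameterization is used to recover.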
Author Information
Paul Chang (Aalto University)
A machine learning researcher in the Arno Solin group at Aalto University, working on probabilistic modelling, specifically Gaussian processes and methods to speed up inference.
Prakhar Verma (Aalto University)
ST John (Aalto University & Finnish Center for Artificial Intelligence)
Victor Picheny (Prowler)
Henry Moss (Secondmind)
I am a Senior Machine Learning Researcher at Secondmind (formerly PROWLER.io). I leverage information-theoretic arguments to provide efficient, reliable and scalable Bayesian optimisation for problems inspired by science and the automotive industry.
Arno Solin (Aalto University)
More from the Same Authors
- 2022 : Towards Improved Learning in Gaussian Processes: The Best of Two Worlds »
  Rui Li · ST John · Arno Solin
- 2022 : GAUCHE: A Library for Gaussian Processes in Chemistry »
  Ryan-Rhys Griffiths · Leo Klarner · Henry Moss · Aditya Ravuri · Sang Truong · Bojana Rankovic · Yuanqi Du · Arian Jamasb · Julius Schwartz · Austin Tripp · Gregory Kell · Anthony Bourached · Alex Chan · Jacob Moss · Chengzhi Guo · Alpha Lee · Philippe Schwaller · Jian Tang
- 2022 : Targeted Causal Elicitation »
  Nazaal Ibrahim · ST John · Zhigao Guo · Samuel Kaski
- 2022 : Joint Point Process Model for Counterfactual Treatment-Outcome Trajectories Under Policy Interventions »
  Çağlar Hızlı · ST John · Anne Juuti · Tuure Saarinen · Kirsi Pietiläinen · Pekka Marttinen
- 2021 : Sparse Gaussian Processes for Stochastic Differential Equations »
  Prakhar Verma · Vincent ADAM · Arno Solin
- 2021 Poster: Dual Parameterization of Sparse Variational Gaussian Processes »
  Vincent ADAM · Paul Chang · Mohammad Emtiyaz Khan · Arno Solin
- 2021 Poster: Periodic Activation Functions Induce Stationarity »
  Lassi Meronen · Martin Trapp · Arno Solin
- 2021 Poster: Scalable Thompson Sampling using Sparse Gaussian Process Models »
  Sattar Vakili · Henry Moss · Artem Artemev · Vincent Dutordoir · Victor Picheny
- 2021 Poster: Spatio-Temporal Variational Gaussian Processes »
  Oliver Hamelijnck · William Wilkinson · Niki Loppi · Arno Solin · Theodoros Damoulas
- 2021 Poster: Scalable Inference in SDEs by Direct Matching of the Fokker–Planck–Kolmogorov Equation »
  Arno Solin · Ella Tamir · Prakhar Verma
- 2020 Poster: Stationary Activations for Uncertainty Calibration in Deep Learning »
  Lassi Meronen · Christabella Irwanto · Arno Solin
- 2020 Poster: Deep Automodulators »
  Ari Heljakka · Yuxin Hou · Juho Kannala · Arno Solin
- 2018 Poster: Infinite-Horizon Gaussian Processes »
  Arno Solin · James Hensman · Richard Turner