

Poster in Workshop: NeurIPS 2022 Workshop on Score-Based Methods

Locking and Quacking: Stacking Bayesian model predictions by log-pooling and superposition

Yuling Yao · Luiz Carvalho · Diego Mesquita


Abstract:

Combining predictive distributions is a central problem in Bayesian inference and machine learning. Currently, predictives are almost exclusively combined using linear density mixtures such as Bayesian model averaging, Bayesian stacking, and mixtures of experts. Nonetheless, linear mixtures impose traits that may be undesirable for some applications, such as multi-modality. While there are alternative strategies (e.g., geometric bridges or superposition), optimizing their parameters usually requires repeatedly computing an intractable normalizing constant. In this extended abstract, we present two novel Bayesian model combination tools. They are generalizations of stacking, but combine posterior densities by log-linear pooling (locking) and quantum superposition (quacking). To optimize model weights while avoiding the burden of normalizing constants, we maximize the Hyvärinen score of the combined posterior predictions. We demonstrate locking and quacking with an illustrative example.
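The quantities named in the abstract can be stated concretely. A log-linear ("locking") pool combines K predictive densities as p_w(y) ∝ ∏_k p_k(y)^{w_k}, a superposition ("quacking") pool as p_w(y) ∝ (∑_k w_k √p_k(y))², and the 1-D Hyvärinen score of a possibly unnormalized density is 2 ∂²_y log p(y) + (∂_y log p(y))², which is independent of the normalizing constant. The sketch below is a rough illustration under these definitions, not the authors' implementation; the Gaussian components and all function names are hypothetical, chosen only to show the mechanics.

import jax
import jax.numpy as jnp
from jax.scipy.stats import norm

# Two illustrative Gaussian predictive densities (hypothetical components).
def log_p1(y):
    return norm.logpdf(y, loc=-1.0, scale=1.0)

def log_p2(y):
    return norm.logpdf(y, loc=2.0, scale=0.5)

def locking_logpool(y, w):
    # Unnormalized log-linear ("locking") pool: sum_k w_k * log p_k(y).
    return w[0] * log_p1(y) + w[1] * log_p2(y)

def quacking_logpool(y, w):
    # Unnormalized superposition ("quacking") pool:
    # log of (sum_k w_k * sqrt(p_k(y)))^2.
    amp = w[0] * jnp.exp(0.5 * log_p1(y)) + w[1] * jnp.exp(0.5 * log_p2(y))
    return 2.0 * jnp.log(amp)

def hyvarinen_score(log_density, y):
    # 1-D Hyvarinen score: 2 * d^2/dy^2 log p(y) + (d/dy log p(y))^2.
    # Any normalizing constant drops out because it does not depend on y.
    grad = jax.grad(log_density)
    hess = jax.grad(grad)
    return 2.0 * hess(y) + grad(y) ** 2

# Example: score the locking pool at a held-out point y = 0.5.
w = jnp.array([0.6, 0.4])
print(hyvarinen_score(lambda y: locking_logpool(y, w), 0.5))

In this reading, the model weights w (constrained to a simplex) would then be chosen by optimizing the Hyvärinen score averaged over held-out data, which is what lets locking and quacking sidestep the pooled density's normalizing constant.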
