

Oral Poster

A Rigorous Link between Deep Ensembles and (Variational) Bayesian Methods

Veit David Wild · Sahra Ghalebikesabi · Dino Sejdinovic · Jeremias Knoblauch

Great Hall & Hall B1+B2 (level 1) #1301
Paper · Poster · OpenReview
Thu 14 Dec 8:45 a.m. PST — 10:45 a.m. PST
 
Oral presentation: Oral 5C Probability/Sampling
Thu 14 Dec 8 a.m. PST — 8:45 a.m. PST

Abstract:

We establish the first mathematically rigorous link between Bayesian, variational Bayesian, and ensemble methods. A key step towards this is to reformulate the non-convex optimisation problem typically encountered in deep learning as a convex optimisation problem in the space of probability measures. On a technical level, our contribution amounts to studying generalised variational inference through the lens of Wasserstein gradient flows. The result is a unified theory of various seemingly disconnected approaches that are commonly used for uncertainty quantification in deep learning---including deep ensembles and (variational) Bayesian methods. This offers a fresh perspective on the reasons behind the success of deep ensembles over procedures based on parameterised variational inference, and allows the derivation of new ensembling schemes with convergence guarantees. We showcase this by proposing a family of interacting deep ensembles with direct parallels to the interactions of particle systems in thermodynamics, and use our theory to prove the convergence of these algorithms to a well-defined global minimiser on the space of probability measures.
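The sketch below is a minimal illustration (not the authors' algorithm) of the general idea behind measure-valued optimisation with interacting particles: an ensemble of parameter "particles" follows a discretised gradient flow of a free-energy-style objective, here approximated by unadjusted Langevin updates on a toy non-convex loss. The loss, step size, and temperature are all hypothetical choices made for the example.

```python
# Illustrative sketch only: an ensemble of parameter particles following a
# noisy gradient flow, so that the empirical measure of the ensemble (rather
# than a single point estimate) approximates a minimiser over probability
# measures. The specific loss and hyperparameters below are assumptions.

import numpy as np

rng = np.random.default_rng(0)

def loss(theta):
    """Toy non-convex loss with two basins, standing in for a deep-learning loss."""
    x, y = theta
    return (x**2 - 1.0) ** 2 + 0.5 * y**2

def grad_loss(theta):
    x, y = theta
    return np.array([4.0 * x * (x**2 - 1.0), y])

n_particles, n_steps = 50, 2000
step, temperature = 1e-2, 0.05  # temperature plays the role of an entropy weight

# Initialise the ensemble from a broad Gaussian.
particles = rng.normal(scale=2.0, size=(n_particles, 2))

for _ in range(n_steps):
    grads = np.stack([grad_loss(p) for p in particles])
    noise = rng.normal(size=particles.shape)
    # Unadjusted Langevin update: gradient step plus noise scaled by the temperature.
    particles += -step * grads + np.sqrt(2.0 * step * temperature) * noise

# Both basins near (+1, 0) and (-1, 0) should be populated, so the ensemble
# captures uncertainty that a single point estimate would miss.
print("mean loss over ensemble:", np.mean([loss(p) for p in particles]))
```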
