Poster

Bayesian Metric Learning for Uncertainty Quantification in Image Retrieval

Frederik Warburg · Marco Miani · Silas Brack · Søren Hauberg

Great Hall & Hall B1+B2 (level 1) #1200

Abstract:

We propose a Bayesian encoder for metric learning. Rather than relying on neural amortization as done in prior works, we learn a distribution over the network weights with the Laplace Approximation. We first prove that the contrastive loss is a negative log-likelihood on the spherical space. We propose three methods that ensure a positive definite covariance matrix. Lastly, we present a novel decomposition of the Generalized Gauss-Newton approximation. Empirically, we show that our Laplacian Metric Learner (LAM) yields well-calibrated uncertainties, reliably detects out-of-distribution examples, and has state-of-the-art predictive performance.
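To make the idea concrete, below is a minimal, hypothetical sketch of the general recipe the abstract describes: fit a Laplace approximation over the weights of a metric-learning encoder trained with a contrastive loss on spherical embeddings, then sample weights from the resulting Gaussian posterior to obtain an embedding distribution and an uncertainty per query. All names (`Encoder`, `contrastive_loss`, `fit_diag_laplace`) and the diagonal empirical-Fisher curvature with a prior precision are illustrative assumptions for this sketch; the paper itself uses a Generalized Gauss-Newton approximation with a novel decomposition, which is not reproduced here.

```python
# Minimal sketch (NOT the authors' implementation): last-layer diagonal Laplace
# approximation for a contrastive metric-learning encoder, plus posterior
# sampling to turn a point embedding into a distribution over embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    def __init__(self, dim_in=32, dim_emb=8):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(dim_in, 64), nn.ReLU())
        self.head = nn.Linear(64, dim_emb)  # Laplace posterior is fit over this layer only

    def forward(self, x):
        # Normalize so embeddings live on the unit sphere, as in spherical metric learning.
        return F.normalize(self.head(self.backbone(x)), dim=-1)


def contrastive_loss(z_a, z_p, z_n, margin=0.5):
    """Pull anchor-positive pairs together, push negatives beyond a margin."""
    d_pos = (z_a - z_p).pow(2).sum(-1)
    d_neg = (z_a - z_n).pow(2).sum(-1)
    return (d_pos + F.relu(margin - d_neg)).mean()


def fit_diag_laplace(model, batches, prior_prec=1.0):
    """Diagonal empirical-Fisher curvature of the head weights -> posterior variances."""
    fisher = {n: torch.zeros_like(p) for n, p in model.head.named_parameters()}
    for x_a, x_p, x_n in batches:
        model.zero_grad()
        contrastive_loss(model(x_a), model(x_p), model(x_n)).backward()
        for n, p in model.head.named_parameters():
            fisher[n] += p.grad.detach() ** 2
    # variance = 1 / (curvature + prior precision); adding the prior keeps it positive.
    return {n: 1.0 / (f + prior_prec) for n, f in fisher.items()}


@torch.no_grad()
def embed_with_uncertainty(model, var, x, n_samples=16):
    """Sample head weights from the Laplace posterior; return embedding mean and variance."""
    mean = {n: p.detach().clone() for n, p in model.head.named_parameters()}
    samples = []
    for _ in range(n_samples):
        for n, p in model.head.named_parameters():
            p.copy_(mean[n] + var[n].sqrt() * torch.randn_like(p))
        samples.append(model(x))
    for n, p in model.head.named_parameters():  # restore the MAP weights
        p.copy_(mean[n])
    z = torch.stack(samples)                     # (n_samples, batch, dim_emb)
    return z.mean(0), z.var(0).sum(-1)           # mean embedding, scalar uncertainty per input


if __name__ == "__main__":
    torch.manual_seed(0)
    model = Encoder()
    batches = [tuple(torch.randn(16, 32) for _ in range(3)) for _ in range(4)]
    var = fit_diag_laplace(model, batches)
    z_mean, z_unc = embed_with_uncertainty(model, var, torch.randn(5, 32))
    print(z_mean.shape, z_unc)
```

In this toy setup the per-query uncertainty is the total variance of the sampled embeddings; inputs far from the training distribution would be expected to yield larger variance, which is the behavior the paper evaluates for out-of-distribution detection.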
