

Poster

Calibration by Distribution Matching: Trainable Kernel Calibration Metrics

Charlie Marx · Sofian Zalouk · Stefano Ermon

Great Hall & Hall B1+B2 (level 1) #1211
Wed 13 Dec 8:45 a.m. PST — 10:45 a.m. PST

Abstract:

Calibration ensures that probabilistic forecasts meaningfully capture uncertainty by requiring that predicted probabilities align with empirical frequencies. However, many existing calibration methods are specialized for post-hoc recalibration, which can worsen the sharpness of forecasts. Drawing on the insight that calibration can be viewed as a distribution matching task, we introduce kernel-based calibration metrics that unify and generalize popular forms of calibration for both classification and regression. These metrics admit differentiable sample estimates, making it easy to incorporate a calibration objective into empirical risk minimization. Furthermore, we provide intuitive mechanisms to tailor calibration metrics to a decision task, and enforce accurate loss estimation and no-regret decisions. Our empirical evaluation demonstrates that employing these metrics as regularizers enhances calibration, sharpness, and decision-making across a range of regression and classification tasks, outperforming methods relying solely on post-hoc recalibration.
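To make the "calibration metric as a regularizer" idea concrete, below is a minimal sketch of a differentiable, kernel-based calibration penalty added to a standard classification loss. The MMCE-style form and the Laplacian kernel with bandwidth 0.4 are assumptions borrowed from prior kernel-calibration work, not the paper's exact construction; function names are illustrative.

```python
import torch
import torch.nn.functional as F

def kernel_calibration_penalty(logits, labels, bandwidth=0.4):
    """Differentiable kernel calibration penalty (MMCE-style sketch).

    Assumption: top-label calibration with a Laplacian kernel on
    confidences; the paper's metrics generalize this family.
    """
    probs = F.softmax(logits, dim=-1)
    conf, pred = probs.max(dim=-1)            # predicted confidence and class
    correct = (pred == labels).float()        # 1 if the prediction is right
    residual = conf - correct                 # calibration residuals
    # Laplacian kernel on confidences: k(c, c') = exp(-|c - c'| / bandwidth)
    diffs = torch.abs(conf.unsqueeze(0) - conf.unsqueeze(1))
    kernel = torch.exp(-diffs / bandwidth)
    # Biased V-statistic estimate of the squared kernel calibration error
    return (residual.unsqueeze(0) * residual.unsqueeze(1) * kernel).mean()

def calibrated_loss(logits, labels, lam=1.0):
    """Empirical risk (cross-entropy) plus the calibration regularizer."""
    return F.cross_entropy(logits, labels) + lam * kernel_calibration_penalty(logits, labels)
```

Because the penalty is a smooth function of the model's outputs, it can be dropped into any gradient-based training loop in place of a post-hoc recalibration step; the weight lam trades off empirical risk against calibration.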
