

Poster in Workshop: Bayesian Deep Learning

Federated Functional Variational Inference

Michael Hutchinson · Matthias Reisser · Christos Louizos


Abstract:

Traditional federated learning (FL) optimizes point estimates of the server model's parameters via a maximum-likelihood objective. While models trained with such objectives achieve competitive predictive accuracy, they are poorly calibrated and provide no reliable uncertainty estimates. Well-calibrated uncertainty is, however, important in safety-critical applications of FL such as self-driving cars and healthcare. In this work we propose several methods for training Bayesian neural networks, networks that provide uncertainty over their model parameters, in FL. We introduce baseline methods that place priors on, and perform inference in, the weight space of the network. We also propose two function-space inference methods, which build upon recent work in functional variational inference to posit prior distributions in, and perform inference on, the function space of the network. These two approaches are based on Federated Averaging (FedAvg) and Expectation-Maximization (EM), respectively. We compare these function-space methods to their weight-space counterparts.
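
The weight-space baselines can be pictured as FedAvg run over variational parameters rather than point weights: each client takes a few gradient steps on its local negative ELBO, and the server averages the resulting posterior parameters. Below is a minimal sketch of that idea for a mean-field Gaussian posterior on a Bayesian linear model; the model, hyperparameters, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# FedAvg over the variational parameters of a mean-field Gaussian
# posterior q(w) = N(mu, diag(softplus(rho)^2)) for a Bayesian
# linear model with a N(0, I) prior and unit observation noise.
# Everything here is a hypothetical sketch, not the paper's code.

rng = np.random.default_rng(0)

def softplus(x):
    return np.log1p(np.exp(x))

def local_update(mu, rho, X, y, steps=100, lr=1e-2, n_samples=1):
    """One client: SGD steps on the negative ELBO (expected negative
    log-likelihood + KL to the N(0, I) prior), using the
    reparameterization trick."""
    mu, rho = mu.copy(), rho.copy()
    for _ in range(steps):
        sigma = softplus(rho)
        eps = rng.standard_normal((n_samples, mu.size))
        w = mu + sigma * eps                   # reparameterized samples
        resid = X @ w.T - y[:, None]           # (n, n_samples)
        grad_w = (X.T @ resid).T               # per-sample NLL gradients
        # KL(q || N(0, I)) gradients for a diagonal Gaussian:
        # d/dmu = mu, d/dsigma = sigma - 1/sigma.
        grad_mu = grad_w.mean(0) + mu
        dsig_drho = 1.0 / (1.0 + np.exp(-rho)) # softplus derivative
        grad_rho = ((grad_w * eps).mean(0) + sigma - 1.0 / sigma) * dsig_drho
        mu -= lr * grad_mu
        rho -= lr * grad_rho
    return mu, rho

# Synthetic federated data: each client holds one shard.
d, clients = 5, 4
w_true = rng.standard_normal(d)
shards = []
for _ in range(clients):
    X = rng.standard_normal((50, d))
    shards.append((X, X @ w_true + 0.1 * rng.standard_normal(50)))

# Server state: global variational parameters.
mu = np.zeros(d)
rho = np.full(d, -3.0)
for _ in range(20):                            # communication rounds
    updates = [local_update(mu, rho, X, y) for X, y in shards]
    mu = np.mean([m for m, _ in updates], axis=0)   # FedAvg step
    rho = np.mean([r for _, r in updates], axis=0)

print("posterior mean:", np.round(mu, 2))
print("true weights:  ", np.round(w_true, 2))
```

The function-space methods in the paper replace the weight-space KL above with a divergence between function-space distributions, evaluated at measurement points, but the communication pattern (local variational updates followed by server aggregation) follows the same FedAvg or EM structure.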
