Poster
Collapsed Variational Bounds for Bayesian Neural Networks
Marcin Tomczak · Siddharth Swaroop · Andrew Foong · Richard Turner

Wed Dec 08 04:30 PM -- 06:00 PM (PST)

Recent interest in learning large variational Bayesian Neural Networks (BNNs) has been partly hampered by poor predictive performance caused by underfitting, and this performance is known to be very sensitive to the choice of prior over the weights. Current practice often fixes the prior parameters to standard values or tunes them using heuristics or cross-validation. In this paper, we treat prior parameters in a distributional way by extending the model and collapsing the variational bound with respect to their posteriors. This leads to novel and tighter Evidence Lower Bounds (ELBOs) for performing variational inference (VI) in BNNs. Our experiments show that the new bounds significantly improve the performance of Gaussian mean-field VI applied to BNNs on a variety of data sets, demonstrating that mean-field VI works well even in deep models. We also find that the tighter ELBOs can be good optimization targets for learning the hyperparameters of hierarchical priors.
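As a rough illustration of what collapsing the bound over prior parameters means, the following generic derivation sketches the idea; the symbols here (lambda for prior hyperparameters, q(w) for the approximate weight posterior, p(lambda) for the hyperprior) are illustrative assumptions, and the specific bounds derived in the paper may take a different form. For a hierarchical model p(D | w) p(w | lambda) p(lambda) with a factorized approximation q(w) q(lambda), the standard ELBO is

\mathcal{L}\big(q(w), q(\lambda)\big) = \mathbb{E}_{q(w)}\!\left[\log p(\mathcal{D} \mid w)\right] + \mathbb{E}_{q(w)\, q(\lambda)}\!\left[\log p(w \mid \lambda)\right] - \mathrm{KL}\!\left(q(\lambda)\,\|\,p(\lambda)\right) + \mathrm{H}\!\left[q(w)\right].

Substituting the optimal hyperparameter posterior q^{*}(\lambda) \propto p(\lambda)\, \exp\!\big(\mathbb{E}_{q(w)}[\log p(w \mid \lambda)]\big) collapses the bound to a function of q(w) alone,

\mathcal{L}_{\mathrm{collapsed}}\big(q(w)\big) = \mathbb{E}_{q(w)}\!\left[\log p(\mathcal{D} \mid w)\right] + \log \int p(\lambda)\, \exp\!\big(\mathbb{E}_{q(w)}[\log p(w \mid \lambda)]\big)\, d\lambda + \mathrm{H}\!\left[q(w)\right],

which is at least as large as \mathcal{L}(q(w), q(\lambda)) for any fixed choice of q(\lambda), and hence a tighter optimization target than an ELBO with fixed prior parameters.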

Author Information

Marcin Tomczak (University of Cambridge)
Siddharth Swaroop (University of Cambridge)
Andrew Foong (University of Cambridge)

PhD student from Malaysia working on Bayesian deep learning in the Computational and Biological Learning Lab at the Cambridge University Engineering Department.

Richard Turner (University of Cambridge)
