Poster in Workshop: I Can’t Believe It’s Not Better: Understanding Deep Learning Through Empirical Falsification

Spike-and-Slab Probabilistic Backpropagation: When Smarter Approximations Make No Difference

Evan Ott · Sinead Williamson


Abstract:

Probabilistic backpropagation is an approximate Bayesian inference method for deep neural networks, using a message-passing framework. These messages---which correspond to distributions arising as we propagate our input through a probabilistic neural network---are approximated as Gaussian. However, in practice, the exact distributions may be highly non-Gaussian. In this paper, we propose a more realistic approximation based on a spike-and-slab distribution. Unfortunately, in this case, better approximation of the messages does not translate to better downstream performance. We present results comparing the two schemes and discuss why we do not see a benefit from this spike-and-slab approach.
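To see why the exact messages are non-Gaussian, consider the canonical case of a ReLU applied to a Gaussian pre-activation: the output has a point mass at zero plus a truncated-Gaussian tail. The sketch below (not the authors' code; the function names and the moment-matched truncated-Gaussian slab parameterization are illustrative assumptions) contrasts the usual moment-matched Gaussian message with a spike-and-slab approximation of that output distribution.

```python
import numpy as np
from scipy.stats import norm

def relu_gaussian_moments(mu, sigma):
    """Exact mean/variance of Y = relu(X) for X ~ N(mu, sigma^2).

    With a = mu / sigma:
      E[Y]   = mu * Phi(a) + sigma * phi(a)
      E[Y^2] = (mu^2 + sigma^2) * Phi(a) + mu * sigma * phi(a)
    A Gaussian message would keep only these two moments.
    """
    a = mu / sigma
    ey = mu * norm.cdf(a) + sigma * norm.pdf(a)
    ey2 = (mu**2 + sigma**2) * norm.cdf(a) + mu * sigma * norm.pdf(a)
    return ey, ey2 - ey**2

def relu_spike_and_slab(mu, sigma):
    """Spike-and-slab approximation of Y = relu(X) (illustrative):
    a spike at 0 with weight p0 = P(X <= 0), plus a slab matched to the
    mean/variance of the Gaussian truncated to (0, inf).
    """
    a = mu / sigma
    p0 = norm.cdf(-a)                   # spike weight: P(X <= 0)
    h = norm.pdf(a) / norm.cdf(a)       # inverse Mills-style ratio
    slab_mean = mu + sigma * h          # E[X | X > 0]
    slab_var = sigma**2 * (1.0 - a * h - h**2)  # Var[X | X > 0]
    return p0, slab_mean, slab_var

mu, sigma = -0.5, 1.0
m, v = relu_gaussian_moments(mu, sigma)
p0, sm, sv = relu_spike_and_slab(mu, sigma)
print(f"moment-matched Gaussian message: N({m:.3f}, {v:.3f})")
print(f"spike-and-slab message: {p0:.3f}*delta_0 "
      f"+ {1 - p0:.3f}*slab(mean={sm:.3f}, var={sv:.3f})")

# Monte Carlo check that the closed-form moments are right.
samples = np.maximum(np.random.default_rng(0).normal(mu, sigma, 10**6), 0)
print(f"MC mean/var: {samples.mean():.3f}, {samples.var():.3f}")
```

The mixture recovers the same overall mean and variance as the Gaussian message while also capturing the point mass at zero, which is the sense in which the spike-and-slab messages are a "smarter" approximation; the paper's finding is that this extra fidelity does not improve downstream performance.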
