Poster
Variational Dropout and the Local Reparameterization Trick
Diederik Kingma · Tim Salimans · Max Welling

Mon Dec 07 04:00 PM -- 08:59 PM (PST) @ 210 C #34

We explore an as yet unexploited opportunity for drastically improving the efficiency of stochastic gradient variational Bayes (SGVB) with global model parameters. Regular SGVB estimators rely on sampling of parameters once per minibatch of data, and have variance that is constant w.r.t. the minibatch size. The efficiency of such estimators can be drastically improved upon by translating uncertainty about global parameters into local noise that is independent across datapoints in the minibatch. Such reparameterizations with local noise can be trivially parallelized and have variance that is inversely proportional to the minibatch size, generally leading to much faster convergence. We find an important connection with regularization by dropout: the original Gaussian dropout objective corresponds to SGVB with local noise, a scale-invariant prior and proportionally fixed posterior variance. Our method allows inference of more flexibly parameterized posteriors; specifically, we propose variational dropout, a generalization of Gaussian dropout with a more flexibly parameterized posterior, often leading to better generalization. The method is demonstrated through several experiments.
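To illustrate the local reparameterization idea described above, here is a minimal sketch for a fully connected layer with a factorized Gaussian posterior over the weights: rather than sampling one weight matrix per minibatch, each datapoint's pre-activations are sampled from their induced Gaussian distribution. Function and variable names are illustrative, not taken from the paper's code.

    import numpy as np

    def local_reparam_layer(X, W_mu, W_logvar, rng=np.random.default_rng()):
        """Sample activations B = X @ W with W ~ N(W_mu, diag(exp(W_logvar))),
        drawing independent noise per datapoint (local reparameterization)
        instead of sampling a single weight matrix for the whole minibatch."""
        # Mean and variance of the pre-activations, per datapoint and output unit.
        act_mu = X @ W_mu                          # shape (batch, out)
        act_var = (X ** 2) @ np.exp(W_logvar)      # shape (batch, out)
        # One independent Gaussian sample per (datapoint, unit); gradient
        # variance then shrinks with minibatch size, unlike global sampling.
        eps = rng.standard_normal(act_mu.shape)
        return act_mu + np.sqrt(act_var) * eps

    # Illustrative usage: a minibatch of 128 inputs through a 784 -> 300 layer.
    X = np.random.randn(128, 784)
    W_mu = 0.01 * np.random.randn(784, 300)
    W_logvar = -6.0 * np.ones((784, 300))
    B = local_reparam_layer(X, W_mu, W_logvar)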

Author Information

Diederik Kingma (U. Amsterdam)
Tim Salimans (Algoritmica)
Max Welling (University of Amsterdam)