Poster
Fast Second Order Stochastic Backpropagation for Variational Inference
Kai Fan · Ziteng Wang · Jeff Beck · James Kwok · Katherine Heller

Mon Dec 07 04:00 PM -- 08:59 PM (PST) @ 210 C #38

We propose a second-order (Hessian or Hessian-free) optimization method for variational inference inspired by Gaussian backpropagation, and argue that quasi-Newton optimization can be developed as well. This is accomplished by generalizing the gradient computation in stochastic backpropagation via a reparametrization trick with lower complexity. As an illustrative example, we apply this approach to the problems of Bayesian logistic regression and the variational auto-encoder (VAE). Additionally, we compute bounds on the estimator variance of intractable expectations for the family of Lipschitz continuous functions. Our method is practical, scalable, and model-free. We demonstrate it on several real-world datasets and provide comparisons with other stochastic gradient methods to show substantial improvement in convergence rates.
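To make the Gaussian backpropagation idea behind the abstract concrete, here is a minimal sketch (not the authors' code) of the reparametrization trick for a scalar Gaussian: writing z = mu + sigma * eps with eps ~ N(0, 1) turns gradients of E[f(z)] into plain Monte Carlo averages, and the identity dE[f]/d(sigma^2) = (1/2) E[f''(z)] shows where second-order (Hessian) information enters. The softplus test function below is an illustrative choice, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(z):   return np.log1p(np.exp(z))        # softplus: a smooth Lipschitz test function
def df(z):  return 1.0 / (1.0 + np.exp(-z))   # first derivative (sigmoid)
def d2f(z): return df(z) * (1.0 - df(z))      # second derivative

mu, sigma, n = 0.5, 1.2, 100_000
eps = rng.standard_normal(n)
z = mu + sigma * eps                          # reparametrized samples, z ~ N(mu, sigma^2)

grad_mu    = df(z).mean()                     # dE[f]/dmu        = E[f'(z)]   (Bonnet's theorem)
grad_var   = 0.5 * d2f(z).mean()              # dE[f]/d(sigma^2) = E[f''(z)]/2 (Price's theorem)
grad_sigma = 2.0 * sigma * grad_var           # chain rule from variance to sigma

print(grad_mu, grad_sigma)
```

In the multivariate case the second identity involves the full Hessian of f, which is what motivates the paper's Hessian-free and quasi-Newton variants when computing it exactly is too costly.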

Author Information

Kai Fan (Duke University)
Ziteng Wang
Jeff Beck
James Kwok (Hong Kong University of Science and Technology)
Katherine Heller (Duke University)