

Poster

A Stein variational Newton method

Gianluca Detommaso · Tiangang Cui · Youssef Marzouk · Alessio Spantini · Robert Scheichl

Room 210 #31

Keywords: [ Efficient Inference Methods ] [ Variational Inference ]


Abstract:

Stein variational gradient descent (SVGD) was recently proposed as a general-purpose nonparametric variational inference algorithm: it minimizes the Kullback–Leibler divergence between the target distribution and its approximation by implementing a form of functional gradient descent in a reproducing kernel Hilbert space [Liu & Wang, NIPS 2016]. In this paper, we accelerate and generalize the SVGD algorithm by including second-order information, thereby approximating a Newton-like iteration in function space. We also show how second-order information can lead to more effective choices of kernel. We observe significant computational gains over the original SVGD algorithm in multiple test cases.
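For context, the first-order SVGD update that this paper builds on moves a set of particles {x_i} along the empirical functional gradient phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]. The sketch below is a minimal NumPy implementation of that original update from Liu & Wang (2016), not of the Newton variant proposed in this paper; the Gaussian target, RBF kernel, median-heuristic bandwidth, and step size are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, h):
    # Pairwise squared distances between particles.
    diff = X[:, None, :] - X[None, :, :]
    sq = np.sum(diff ** 2, axis=-1)
    K = np.exp(-sq / h)
    # grad_K[j, i, :] = gradient of k(x_j, x_i) with respect to x_j.
    grad_K = (-2.0 / h) * diff * K[:, :, None]
    return K, grad_K

def svgd_step(X, grad_log_p, eps=0.1):
    n = X.shape[0]
    # Median heuristic for the kernel bandwidth (a common default choice).
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    h = np.median(sq) / np.log(n + 1) + 1e-8
    K, grad_K = rbf_kernel(X, h)
    # Empirical functional gradient: kernel-weighted score (drives particles
    # toward high density) plus a repulsion term (keeps them spread out).
    phi = (K @ grad_log_p(X) + grad_K.sum(axis=0)) / n
    return X + eps * phi

# Illustrative target: a 2-D standard Gaussian, whose score is -x.
grad_log_p = lambda X: -X
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, size=(100, 2))  # particles start far from the target
for _ in range(500):
    X = svgd_step(X, grad_log_p)
print(X.mean(axis=0), X.std(axis=0))    # should approach [0, 0] and [1, 1]
```

The paper's contribution, roughly, is to replace the fixed step along phi with a Newton-like step that also uses second-order (Hessian) information about log p, which is what yields the reported speedups.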
