Poster in Workshop: OPT 2022: Optimization for Machine Learning

A Finite-Particle Convergence Rate for Stein Variational Gradient Descent

Jiaxin Shi · Lester Mackey


Abstract: We provide a first finite-particle convergence rate for Stein variational gradient descent (SVGD). Specifically, with an appropriate choice of step size sequence, SVGD with $n$ particles drives the kernel Stein discrepancy to zero at the rate $O\left(\frac{1}{\sqrt{\log\log n}}\right)$. We suspect that the dependence on $n$ can be improved, and we hope that our explicit, non-asymptotic proof strategy will serve as a template for future refinements.
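For readers unfamiliar with the algorithm the abstract analyzes, SVGD evolves a set of $n$ particles by the update $x_i \leftarrow x_i + \epsilon\,\phi(x_i)$ with $\phi(x_i) = \frac{1}{n}\sum_j \left[k(x_j, x_i)\nabla\log p(x_j) + \nabla_{x_j} k(x_j, x_i)\right]$, an attraction term toward high-density regions plus a kernel repulsion term that keeps particles spread out. The sketch below is a minimal NumPy illustration of this standard update (it is not the paper's code); the RBF kernel, the median-heuristic bandwidth, and the standard-normal target are all illustrative assumptions.

```python
import numpy as np

def svgd_step(x, grad_logp, step=0.05):
    """One SVGD update for particles x of shape (n, d).

    Uses an RBF kernel k(a, b) = exp(-||a - b||^2 / h) with the
    median-heuristic bandwidth (an illustrative, common choice).
    """
    n = x.shape[0]
    diffs = x[:, None, :] - x[None, :, :]          # (n, n, d): x_i - x_j
    sq_dists = np.sum(diffs ** 2, axis=-1)         # (n, n)
    h = np.median(sq_dists) / np.log(n + 1) + 1e-12
    k = np.exp(-sq_dists / h)                      # kernel matrix k(x_i, x_j)
    # Attraction: sum_j k(x_j, x_i) grad log p(x_j)
    attract = k @ grad_logp(x)
    # Repulsion: sum_j grad_{x_j} k(x_j, x_i) = (2/h) sum_j (x_i - x_j) k_ij
    repulse = (2.0 / h) * np.sum(diffs * k[:, :, None], axis=1)
    return x + step * (attract + repulse) / n

# Toy run: transport particles started far from the mode of a
# standard normal target, grad log p(x) = -x (assumed target).
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=0.5, size=(50, 1))   # start near 5
for _ in range(1000):
    x = svgd_step(x, lambda z: -z)
print(float(x.mean()))  # particle mean moves toward the target mean 0
```

The finite-$n$ analysis in the paper measures how quickly the empirical distribution of such particles approaches the target, quantified by the kernel Stein discrepancy.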