Oral presentation at the Workshop on Optimal Transport and Machine Learning
Provably Fast Finite Particle Variants of SVGD via Virtual Particle Stochastic Approximation
Aniket Das · Dheeraj Nagaraj
Workshop session: Sat 16 Dec, 6:30 a.m. — 3:30 p.m. PST
Talk: Sat 16 Dec, 1 p.m. — 1:15 p.m. PST
Abstract:
SVGD is a popular particle-based variational inference algorithm with well-studied mean-field dynamics. However, its finite-particle behavior is far less understood. Our work introduces the notion of *virtual particles* to develop novel stochastic approximations of mean-field SVGD dynamics in the space of probability measures that are exactly realizable using finite particles. As a result, we design two computationally efficient variants of SVGD (VP-SVGD and GB-SVGD) with provably fast finite-particle convergence rates. Our algorithms are specific random-batch approximations of SVGD which are computationally more efficient than ordinary SVGD. We show that the $n$ output particles of VP-SVGD and GB-SVGD, run for $T$ steps with batch size $K$, are as good as i.i.d. samples from a measure whose Kernel Stein Discrepancy to the target is at most $O(\tfrac{d^{1/3}}{(KT)^{1/6}})$ under standard assumptions. We prove similar results under a mild growth condition on the score function, which is weaker than the assumptions of prior works. Our convergence rates for the empirical measure (of the particles output by VP-SVGD and GB-SVGD) to the target distribution enjoy a **double exponential improvement** over the best known finite-particle analysis of SVGD. Furthermore, our results give the **first known polynomial oracle complexity in dimension**, completely eliminating the curse of dimensionality exhibited by previously known finite-particle rates.
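For readers unfamiliar with the baseline algorithm, the sketch below shows a generic SVGD update where each particle's velocity is computed from a random size-$K$ subsample of the particles rather than all $n$, illustrating the random-batch idea the abstract refers to. This is a minimal illustration under assumed details (RBF kernel, fixed bandwidth, uniform subsampling without replacement), not the authors' exact VP-SVGD or GB-SVGD schemes.

```python
import numpy as np

def rbf_kernel_and_grad(x, y, h=1.0):
    """RBF kernel k(x, y) = exp(-||x - y||^2 / (2h)) and its gradient w.r.t. x."""
    diff = x - y
    k = np.exp(-np.dot(diff, diff) / (2 * h))
    grad_x = -(diff / h) * k  # gradient of k with respect to its first argument
    return k, grad_x

def random_batch_svgd_step(particles, score, step_size=0.05, K=5, rng=None):
    """One random-batch SVGD step: each particle moves along a kernelized
    Stein direction estimated from a size-K subsample (a generic sketch of
    the random-batch idea, not the paper's exact VP-SVGD/GB-SVGD updates)."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(particles)
    batch = rng.choice(n, size=K, replace=False)
    updated = particles.copy()
    for i in range(n):
        phi = np.zeros_like(particles[i])
        for j in batch:
            k, grad_k = rbf_kernel_and_grad(particles[j], particles[i])
            # Driving term pulls toward high density; grad_k term repels particles.
            phi += k * score(particles[j]) + grad_k
        updated[i] = particles[i] + step_size * phi / K
    return updated

# Toy usage: sample a standard Gaussian target, whose score is score(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(size=(20, 2)) + 3.0  # particles start far from the target mean
for _ in range(200):
    x = random_batch_svgd_step(x, lambda z: -z, step_size=0.05, K=5, rng=rng)
```

After the loop, the particle cloud should have drifted from its initial mean near $(3, 3)$ toward the target mean at the origin; the batch size `K` trades per-step cost against the variance of each update, which is the computational lever the paper's variants exploit.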