Mirrored Langevin Dynamics
Ya-Ping Hsieh · Ali Kavis · Paul Rolland · Volkan Cevher

Thu Dec 06 07:45 AM -- 09:45 AM (PST) @ Room 210 #43

We consider the problem of sampling from constrained distributions, which has posed significant challenges to both non-asymptotic analysis and algorithmic design. We propose a unified framework, which is inspired by the classical mirror descent, to derive novel first-order sampling schemes. We prove that, for a general target distribution with strongly convex potential, our framework implies the existence of a first-order algorithm achieving Õ(ε⁻²d) convergence, suggesting that the state-of-the-art Õ(ε⁻⁶d⁵) can be vastly improved. With the important Latent Dirichlet Allocation (LDA) application in mind, we specialize our algorithm to sample from Dirichlet posteriors, and derive the first non-asymptotic Õ(ε⁻²d²) rate for first-order sampling. We further extend our framework to the mini-batch setting and prove convergence rates when only stochastic gradients are available. Finally, we report promising experimental results for LDA on real datasets.
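To make the mirror-descent-inspired recipe concrete, below is a minimal, hedged sketch (not the authors' implementation) of the Dirichlet specialization mentioned in the abstract. It assumes the entropic mirror map with the standard multinomial-logit parametrization of the simplex, x_i = exp(y_i)/(1 + Σ_j exp(y_j)) for i < d and x_d = 1/(1 + Σ_j exp(y_j)); under this change of variables a Dirichlet(α) target pushes forward to a smooth log-concave density on R^{d-1} whose potential gradient is ∇_j W(y) = A·x_j − α_j with A = Σ_i α_i, so unadjusted (Euler–Maruyama) Langevin can be run unconstrained in the dual space. The function name and step-size/iteration defaults are illustrative choices, not from the paper.

```python
import numpy as np

def sample_dirichlet_mld(alpha, n_iter=60000, step=0.01, seed=0):
    """Sketch: mirrored Langevin dynamics for a Dirichlet(alpha) target.

    Runs unadjusted Langevin in the dual (mirror) space R^{d-1} obtained
    from the entropic mirror map / multinomial-logit parametrization,
    then maps iterates back to the simplex.
    """
    rng = np.random.default_rng(seed)
    alpha = np.asarray(alpha, dtype=float)
    d = alpha.size
    A = alpha.sum()                      # total concentration
    y = np.zeros(d - 1)                  # dual variable
    samples = []
    for t in range(n_iter):
        # Inverse mirror map: dual variable -> interior of the simplex
        # (shift by m for numerical stability of the exponentials).
        m = max(y.max(), 0.0)
        z = np.exp(y - m)
        denom = z.sum() + np.exp(-m)
        x_head = z / denom               # first d-1 simplex coordinates
        # Gradient of the dual potential W(y) = -sum_i alpha_i log x_i(y)
        grad = A * x_head - alpha[:d - 1]
        # Euler-Maruyama discretization of Langevin dynamics in dual space
        y = y - step * grad + np.sqrt(2 * step) * rng.standard_normal(d - 1)
        if t > n_iter // 5:              # discard burn-in
            samples.append(np.append(x_head, 1.0 - x_head.sum()))
    return np.array(samples)
```

Because the dual density is smooth and unconstrained, no projection or rejection step is needed; the constraint x ∈ simplex is enforced exactly by the inverse mirror map. The empirical mean of the returned samples should approximate the Dirichlet mean α/A up to the usual O(step) discretization bias.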

Author Information

Ya-Ping Hsieh (EPFL)
Ali Kavis (EPFL)
Paul Rolland (EPFL)
Volkan Cevher (EPFL)
