Spotlight
The Randomized Midpoint Method for Log-Concave Sampling
Ruoqi Shen · Yin Tat Lee
Sampling from log-concave distributions is a well-researched problem
with many applications in statistics and machine learning. We
study distributions of the form $p^{*}\propto\exp(-f(x))$, where
$f:\mathbb{R}^{d}\rightarrow\mathbb{R}$ is $m$-strongly convex and has an
$L$-Lipschitz gradient. We propose a Markov chain
Monte Carlo (MCMC) algorithm based on the underdamped Langevin diffusion
(ULD). It achieves $\epsilon\cdot D$ error (in 2-Wasserstein distance)
in $\tilde{O}\left(\kappa^{7/6}/\epsilon^{1/3}+\kappa/\epsilon^{2/3}\right)$
steps, where $D\overset{\mathrm{def}}{=}\sqrt{\frac{d}{m}}$ is the effective diameter
of the problem and $\kappa\overset{\mathrm{def}}{=}\frac{L}{m}$ is the condition number. Our algorithm is significantly faster than the previously best
known algorithm for this problem, which requires $\tilde{O}\left(\kappa^{1.5}/\epsilon\right)$
steps \cite{chen2019optimal,dalalyan2018sampling}. Moreover, our
algorithm is easily parallelized, requiring only $O(\kappa\log\frac{1}{\epsilon})$
parallel steps.
To solve the sampling problem, we propose a new framework for discretizing
stochastic differential equations. We apply this framework to discretize
and simulate ULD, which converges to the target distribution $p^{*}$.
The framework applies not only to the log-concave sampling
problem, but to any problem that involves simulating (stochastic) differential
equations.
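The randomized midpoint idea can be illustrated on the simpler overdamped Langevin diffusion $dX_t = -\nabla f(X_t)\,dt + \sqrt{2}\,dW_t$: instead of evaluating the gradient at the start of each step, evaluate it at a uniformly random intermediate time, which makes the discretization error unbiased to leading order. The sketch below is a minimal illustration of that idea on a toy Gaussian target, not the paper's exact ULD integrator; all function and variable names are chosen for illustration.

```python
import numpy as np

def randomized_midpoint_step(x, grad_f, h, rng):
    """One randomized-midpoint step for dX = -grad_f(X) dt + sqrt(2) dW."""
    alpha = rng.uniform()  # random evaluation time alpha*h within the step
    # Brownian increment of sqrt(2)*W over [0, alpha*h]
    w_mid = rng.normal(size=x.shape) * np.sqrt(2 * alpha * h)
    # Euler predictor to the random midpoint
    x_mid = x - alpha * h * grad_f(x) + w_mid
    # Independent Brownian increment over (alpha*h, h]
    w_rest = rng.normal(size=x.shape) * np.sqrt(2 * (1 - alpha) * h)
    # Full step driven by the gradient at the random midpoint
    return x - h * grad_f(x_mid) + w_mid + w_rest

# Toy target: standard Gaussian, f(x) = ||x||^2 / 2, so grad_f(x) = x.
rng = np.random.default_rng(0)
grad_f = lambda x: x
x = np.zeros(2)
samples = []
for _ in range(6000):
    x = randomized_midpoint_step(x, grad_f, 0.1, rng)
    samples.append(x.copy())
samples = np.array(samples[1000:])  # discard burn-in
emp_var = samples.var()             # should be close to the target variance 1
```

Note that the same Brownian path is shared between the midpoint predictor and the full step (`w_mid` appears in both), which is essential for the method's accuracy; the paper's actual algorithm applies this construction to the two coupled coordinates of ULD.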
Author Information
Ruoqi Shen (University of Washington)
Yin Tat Lee (UW)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: The Randomized Midpoint Method for Log-Concave Sampling »
  Wed. Dec 11th 01:30 -- 03:30 AM, Room East Exhibition Hall B + C #163
More from the Same Authors
- 2021 Spotlight: Numerical Composition of Differential Privacy »
  Sivakanth Gopi · Yin Tat Lee · Lukas Wutschitz
- 2021 Spotlight: Private Non-smooth ERM and SCO in Subquadratic Steps »
  Janardhan Kulkarni · Yin Tat Lee · Daogao Liu
- 2022 Poster: Sampling with Riemannian Hamiltonian Monte Carlo in a Constrained Space »
  Yunbum Kook · Yin-Tat Lee · Ruoqi Shen · Santosh Vempala
- 2022 Poster: A gradient sampling method with complexity guarantees for Lipschitz functions in high and low dimensions »
  Damek Davis · Dmitriy Drusvyatskiy · Yin Tat Lee · Swati Padmanabhan · Guanghao Ye
- 2022 Poster: Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity »
  Sally Dong · Haotian Jiang · Yin Tat Lee · Swati Padmanabhan · Guanghao Ye
- 2022 Poster: Near-Optimal Randomized Exploration for Tabular Markov Decision Processes »
  Zhihan Xiong · Ruoqi Shen · Qiwen Cui · Maryam Fazel · Simon Du
- 2021 Poster: Private Non-smooth ERM and SCO in Subquadratic Steps »
  Janardhan Kulkarni · Yin Tat Lee · Daogao Liu
- 2021 Poster: Lower Bounds on Metropolized Sampling Methods for Well-Conditioned Distributions »
  Yin Tat Lee · Ruoqi Shen · Kevin Tian
- 2021 Poster: Fast and Memory Efficient Differentially Private-SGD via JL Projections »
  Zhiqi Bu · Sivakanth Gopi · Janardhan Kulkarni · Yin Tat Lee · Judy Hanwen Shen · Uthaipon Tantipongpipat
- 2021 Poster: Numerical Composition of Differential Privacy »
  Sivakanth Gopi · Yin Tat Lee · Lukas Wutschitz
- 2021 Oral: Lower Bounds on Metropolized Sampling Methods for Well-Conditioned Distributions »
  Yin Tat Lee · Ruoqi Shen · Kevin Tian
- 2020 Poster: Generalized Leverage Score Sampling for Neural Networks »
  Jason Lee · Ruoqi Shen · Zhao Song · Mengdi Wang · Zheng Yu
- 2020 Poster: Acceleration with a Ball Optimization Oracle »
  Yair Carmon · Arun Jambulapati · Qijia Jiang · Yujia Jin · Yin Tat Lee · Aaron Sidford · Kevin Tian
- 2020 Oral: Acceleration with a Ball Optimization Oracle »
  Yair Carmon · Arun Jambulapati · Qijia Jiang · Yujia Jin · Yin Tat Lee · Aaron Sidford · Kevin Tian
- 2020 Poster: Network size and size of the weights in memorization with two-layers neural networks »
  Sebastien Bubeck · Ronen Eldan · Yin Tat Lee · Dan Mikulincer
- 2018 Poster: Optimal Algorithms for Non-Smooth Distributed Optimization in Networks »
  Kevin Scaman · Francis Bach · Sebastien Bubeck · Laurent Massoulié · Yin Tat Lee
- 2018 Oral: Optimal Algorithms for Non-Smooth Distributed Optimization in Networks »
  Kevin Scaman · Francis Bach · Sebastien Bubeck · Laurent Massoulié · Yin Tat Lee