The randomized midpoint method, proposed by Shen and Lee (2019), has emerged as an optimal discretization procedure for simulating the continuous-time underdamped Langevin diffusion. In this paper, we analyze several probabilistic properties of the randomized midpoint discretization method, considering both overdamped and underdamped Langevin dynamics. We first characterize the stationary distribution of the discrete chain obtained with constant step-size discretization and show that it is biased away from the target distribution; notably, the step size must go to zero for the bias to vanish. Next, we establish the asymptotic normality of numerical integration using the randomized midpoint method and highlight its relative advantages and disadvantages compared with other discretizations. Our results collectively provide several insights into the behavior of the randomized midpoint discretization method, including how to obtain confidence intervals for numerical integration.
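For illustration, below is a minimal sketch of one way a randomized midpoint step could be implemented for the overdamped Langevin diffusion dX_t = -∇f(X_t) dt + √2 dB_t, run with a constant step size as discussed in the abstract. The function names (`randomized_midpoint_overdamped`, `grad_f`) and the particular handling of the correlated Brownian increments are illustrative assumptions, not code from the paper.

```python
import numpy as np

def randomized_midpoint_overdamped(grad_f, x0, step_size, n_steps, rng=None):
    """Sketch of a constant step-size randomized midpoint discretization of
    the overdamped Langevin diffusion dX_t = -grad_f(X_t) dt + sqrt(2) dB_t.
    (Illustrative only; not the authors' reference implementation.)"""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    d = x.shape[0]
    h = step_size
    for _ in range(n_steps):
        alpha = rng.uniform()                     # random midpoint location in [0, 1)
        # Increments of a single Brownian path on [0, alpha*h] and [alpha*h, h].
        w1 = rng.normal(scale=np.sqrt(alpha * h), size=d)
        w2 = rng.normal(scale=np.sqrt((1.0 - alpha) * h), size=d)
        # Euler step to the randomized midpoint ...
        x_mid = x - alpha * h * grad_f(x) + np.sqrt(2.0) * w1
        # ... then the full step uses the gradient evaluated at the midpoint.
        x = x - h * grad_f(x_mid) + np.sqrt(2.0) * (w1 + w2)
    return x

# Example: target N(0, I), i.e. potential f(x) = ||x||^2 / 2 with grad_f(x) = x.
sample = randomized_midpoint_overdamped(grad_f=lambda x: x,
                                        x0=np.zeros(2),
                                        step_size=0.1,
                                        n_steps=1000)
```

Iterating this update with a fixed h produces the discrete Markov chain studied in the paper; per the abstract, its stationary distribution is biased away from the target unless the step size is taken to zero.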
Author Information
Ye He (University of California, Davis)
Krishnakumar Balasubramanian (University of California, Davis)
Murat Erdogdu (University of Toronto)
More from the Same Authors
- 2022 Poster: A Projection-free Algorithm for Constrained Stochastic Multi-level Composition Optimization
  Tesi Xiao · Krishnakumar Balasubramanian · Saeed Ghadimi
- 2022 Poster: Constrained Stochastic Nonconvex Optimization with State-dependent Markov Data
  Abhishek Roy · Krishnakumar Balasubramanian · Saeed Ghadimi
- 2021 Poster: An Analysis of Constant Step Size SGD in the Non-convex Regime: Asymptotic Normality and Bias
  Lu Yu · Krishnakumar Balasubramanian · Stanislav Volgushev · Murat Erdogdu
- 2021 Poster: On Empirical Risk Minimization with Dependent and Heavy-Tailed Data
  Abhishek Roy · Krishnakumar Balasubramanian · Murat Erdogdu
- 2020 Poster: Escaping Saddle-Point Faster under Interpolation-like Conditions
  Abhishek Roy · Krishnakumar Balasubramanian · Saeed Ghadimi · Prasant Mohapatra
- 2018 Poster: Zeroth-order (Non)-Convex Stochastic Optimization via Conditional Gradient and Gradient Updates
  Krishnakumar Balasubramanian · Saeed Ghadimi
- 2018 Poster: Global Non-convex Optimization with Discretized Diffusions
  Murat Erdogdu · Lester Mackey · Ohad Shamir
- 2017 Poster: Robust Estimation of Neural Signals in Calcium Imaging
  Hakan Inan · Murat Erdogdu · Mark Schnitzer
- 2017 Poster: Estimating High-dimensional Non-Gaussian Multiple Index Models via Stein's Lemma
  Zhuoran Yang · Krishnakumar Balasubramanian · Zhaoran Wang · Han Liu
- 2017 Poster: Inference in Graphical Models via Semidefinite Programming Hierarchies
  Murat Erdogdu · Yash Deshpande · Andrea Montanari
- 2016 Poster: Scaled Least Squares Estimator for GLMs in Large-Scale Problems
  Murat Erdogdu · Lee H Dicker · Mohsen Bayati
- 2015 Poster: Convergence rates of sub-sampled Newton methods
  Murat Erdogdu · Andrea Montanari
- 2015 Poster: Newton-Stein Method: A Second Order Method for GLMs via Stein's Lemma
  Murat Erdogdu
- 2015 Spotlight: Newton-Stein Method: A Second Order Method for GLMs via Stein's Lemma
  Murat Erdogdu
- 2013 Poster: Estimating LASSO Risk and Noise Level
  Mohsen Bayati · Murat Erdogdu · Andrea Montanari