Gibbs sampling is a Markov chain Monte Carlo method that is often used for learning and inference on graphical models. Minibatching, in which a small random subset of the graph is used at each iteration, can help make Gibbs sampling scale to large graphical models by reducing its computational cost. In this paper, we propose a new auxiliary-variable minibatched Gibbs sampling method, Poisson-minibatching Gibbs, which both produces unbiased samples and has a theoretical guarantee on its convergence rate. In comparison to previous minibatched Gibbs algorithms, Poisson-minibatching Gibbs supports fast sampling from continuous state spaces and avoids the need for a Metropolis-Hastings correction on discrete state spaces. We demonstrate the effectiveness of our method on multiple applications and in comparison with both plain Gibbs and previous minibatched methods.
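For readers unfamiliar with the baseline the abstract refers to, the sketch below shows plain single-site Gibbs sampling on a small Ising model, a simple discrete graphical model. It is meant only to illustrate the per-update factor sum whose cost minibatching targets; it does not reproduce the paper's Poisson auxiliary-variable construction, and the grid size, coupling strength, and sweep count are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: plain single-site Gibbs sampling on a 2D Ising model.
# This is the baseline method, not the paper's Poisson-minibatching Gibbs.
# Grid size, inverse temperature, and sweep count are illustrative assumptions.

rng = np.random.default_rng(0)
n, beta = 32, 0.4                      # grid side length, inverse temperature (assumed)
x = rng.choice([-1, 1], size=(n, n))   # random initial spin configuration

def neighbor_sum(x, i, j):
    """Sum of the four neighboring spins, i.e. the factors touching site (i, j)."""
    return (x[(i - 1) % n, j] + x[(i + 1) % n, j]
            + x[i, (j - 1) % n] + x[i, (j + 1) % n])

def gibbs_sweep(x):
    """One full sweep of single-site Gibbs updates."""
    for i in range(n):
        for j in range(n):
            # Conditional distribution of x[i, j] given its neighbors.
            # A minibatched sampler (such as the paper's Poisson-minibatching
            # Gibbs) would evaluate only a random subset of the factors in this
            # sum to cut the per-update cost.
            field = beta * neighbor_sum(x, i, j)
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))
            x[i, j] = 1 if rng.random() < p_plus else -1
    return x

for sweep in range(100):               # illustrative number of sweeps
    x = gibbs_sweep(x)
```

On a dense graphical model, each conditional update touches every factor adjacent to the variable being resampled, which is exactly the cost that the minibatched methods discussed in the abstract aim to reduce.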
Author Information
Ruqi Zhang (Cornell University)
Christopher De Sa (Cornell)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Spotlight: Poisson-Minibatching for Gibbs Sampling with Convergence Rate Guarantees
  Wed. Dec 11th 01:15 -- 01:20 AM, Room West Ballroom C
More from the Same Authors
- 2021 Poster: Representing Hyperbolic Space Accurately using Multi-Component Floats
  Tao Yu · Christopher De Sa
- 2021 Poster: Hyperparameter Optimization Is Deceiving Us, and How to Stop It
  A. Feder Cooper · Yucheng Lu · Jessica Forde · Christopher De Sa
- 2021 Poster: Equivariant Manifold Flows
  Isay Katsman · Aaron Lou · Derek Lim · Qingxuan Jiang · Ser Nam Lim · Christopher De Sa
- 2020 Workshop: Differential Geometry meets Deep Learning (DiffGeo4DL)
  Joey Bose · Emile Mathieu · Charline Le Lan · Ines Chami · Frederic Sala · Christopher De Sa · Maximilian Nickel · Christopher Ré · Will Hamilton
- 2020 Poster: Random Reshuffling is Not Always Better
  Christopher De Sa
- 2020 Poster: Asymptotically Optimal Exact Minibatch Metropolis-Hastings
  Ruqi Zhang · A. Feder Cooper · Christopher De Sa
- 2020 Spotlight: Asymptotically Optimal Exact Minibatch Metropolis-Hastings
  Ruqi Zhang · A. Feder Cooper · Christopher De Sa
- 2020 Spotlight: Random Reshuffling is Not Always Better
  Christopher De Sa
- 2020 Poster: Neural Manifold Ordinary Differential Equations
  Aaron Lou · Derek Lim · Isay Katsman · Leo Huang · Qingxuan Jiang · Ser Nam Lim · Christopher De Sa
- 2019 Poster: Numerically Accurate Hyperbolic Embeddings Using Tiling-Based Models
  Tao Yu · Christopher De Sa
- 2019 Spotlight: Numerically Accurate Hyperbolic Embeddings Using Tiling-Based Models
  Tao Yu · Christopher De Sa
- 2019 Poster: Dimension-Free Bounds for Low-Precision Training
  Zheng Li · Christopher De Sa
- 2019 Poster: Channel Gating Neural Networks
  Weizhe Hua · Yuan Zhou · Christopher De Sa · Zhiru Zhang · G. Edward Suh