Many learning algorithms, such as stochastic gradient descent, are affected by the order in which training examples are used. It is often observed that sampling the training examples without replacement, also known as random reshuffling, causes learning algorithms to converge faster. We give a counterexample to the Operator Inequality of Noncommutative Arithmetic and Geometric Means, a longstanding conjecture that relates to the performance of random reshuffling in learning algorithms (Recht and Ré, "Toward a noncommutative arithmetic-geometric mean inequality: conjectures, case-studies, and consequences," COLT 2012). We use this counterexample to construct a learning task and algorithm for which with-replacement random sampling actually outperforms random reshuffling.
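As a rough illustration (not taken from the paper), the sketch below contrasts the two sampling schemes on a toy least-squares problem: with-replacement SGD draws an example independently at each step, while random reshuffling walks through a fresh random permutation of the data in each epoch. The problem setup, step size, and function names are illustrative assumptions, not the construction used in the paper.

```python
# Minimal sketch (illustrative assumptions, not the paper's construction):
# SGD on f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2 under two sampling schemes.
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
lr = 0.01  # assumed step size

def sgd_step(x, i):
    # Gradient of the i-th example's loss: a_i * (a_i^T x - b_i)
    return x - lr * A[i] * (A[i] @ x - b[i])

def with_replacement(epochs):
    x = np.zeros(d)
    for _ in range(epochs * n):
        x = sgd_step(x, rng.integers(n))  # each step draws an example i.i.d.
    return x

def random_reshuffling(epochs):
    x = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):      # each epoch visits every example once
            x = sgd_step(x, i)
    return x

x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
for name, run in [("with-replacement", with_replacement),
                  ("random reshuffling", random_reshuffling)]:
    err = np.linalg.norm(run(epochs=50) - x_star)
    print(f"{name:20s} distance to optimum: {err:.4f}")
```

On typical convex problems like this toy example, random reshuffling is the scheme usually observed to converge faster; the paper's contribution is a task and algorithm where that ordering is reversed.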
Author Information
Christopher De Sa (Cornell)
Related Events (a corresponding poster, oral, or spotlight)
- 2020 Spotlight: Random Reshuffling is Not Always Better
  Thu. Dec 10th, 03:30 -- 03:40 PM, Orals & Spotlights: Optimization/Theory
More from the Same Authors
- 2021 Poster: Representing Hyperbolic Space Accurately using Multi-Component Floats
  Tao Yu · Christopher De Sa
- 2021 Poster: Hyperparameter Optimization Is Deceiving Us, and How to Stop It
  A. Feder Cooper · Yucheng Lu · Jessica Forde · Christopher De Sa
- 2021 Poster: Equivariant Manifold Flows
  Isay Katsman · Aaron Lou · Derek Lim · Qingxuan Jiang · Ser Nam Lim · Christopher De Sa
- 2020 Workshop: Differential Geometry meets Deep Learning (DiffGeo4DL)
  Joey Bose · Emile Mathieu · Charline Le Lan · Ines Chami · Frederic Sala · Christopher De Sa · Maximilian Nickel · Christopher Ré · Will Hamilton
- 2020 Poster: Asymptotically Optimal Exact Minibatch Metropolis-Hastings
  Ruqi Zhang · A. Feder Cooper · Christopher De Sa
- 2020 Spotlight: Asymptotically Optimal Exact Minibatch Metropolis-Hastings
  Ruqi Zhang · A. Feder Cooper · Christopher De Sa
- 2020 Poster: Neural Manifold Ordinary Differential Equations
  Aaron Lou · Derek Lim · Isay Katsman · Leo Huang · Qingxuan Jiang · Ser Nam Lim · Christopher De Sa
- 2019 Poster: Numerically Accurate Hyperbolic Embeddings Using Tiling-Based Models
  Tao Yu · Christopher De Sa
- 2019 Spotlight: Numerically Accurate Hyperbolic Embeddings Using Tiling-Based Models
  Tao Yu · Christopher De Sa
- 2019 Poster: Dimension-Free Bounds for Low-Precision Training
  Zheng Li · Christopher De Sa
- 2019 Poster: Poisson-Minibatching for Gibbs Sampling with Convergence Rate Guarantees
  Ruqi Zhang · Christopher De Sa
- 2019 Spotlight: Poisson-Minibatching for Gibbs Sampling with Convergence Rate Guarantees
  Ruqi Zhang · Christopher De Sa
- 2019 Poster: Channel Gating Neural Networks
  Weizhe Hua · Yuan Zhou · Christopher De Sa · Zhiru Zhang · G. Edward Suh