Poster
Never Go Full Batch (in Stochastic Convex Optimization)
Idan Amir · Yair Carmon · Tomer Koren · Roi Livni
We study the generalization performance of full-batch optimization algorithms for stochastic convex optimization: these are first-order methods that access only the exact gradient of the empirical risk (rather than gradients with respect to individual data points), a class that includes gradient descent, mirror descent, and their regularized and/or accelerated variants. We provide a new separation result showing that, while algorithms such as stochastic gradient descent can generalize and optimize the population risk to within $\epsilon$ after $O(1/\epsilon^2)$ iterations, full-batch methods either need at least $\Omega(1/\epsilon^4)$ iterations or exhibit a dimension-dependent sample complexity.
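The distinction the abstract draws is one of gradient access: a full-batch method may only query the exact gradient of the empirical risk, while SGD queries gradients of individual sampled examples. The sketch below illustrates the two access models on a toy least-squares problem; it is illustrative only, with all problem parameters chosen for the example (the paper's lower bound rests on a carefully constructed hard instance, not on an objective like this one).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy convex instance: least-squares regression (illustrative choice,
# not the hard instance from the paper).
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

def empirical_risk(w):
    return 0.5 * np.mean((X @ w - y) ** 2)

def full_batch_gd(steps, lr=0.05):
    """Full-batch access model: each iteration sees only the exact
    gradient of the empirical risk, averaged over all n examples."""
    w = np.zeros(d)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n  # exact empirical-risk gradient
        w -= lr * grad
    return w

def sgd(steps, lr=0.05):
    """Stochastic access model: each iteration sees the gradient of a
    single randomly sampled example."""
    w = np.zeros(d)
    for _ in range(steps):
        i = rng.integers(n)
        grad = (X[i] @ w - y[i]) * X[i]  # single-example gradient
        w -= lr * grad
    return w

w_gd = full_batch_gd(500)
w_sgd = sgd(500)
print("full-batch GD risk:", empirical_risk(w_gd))
print("SGD risk:          ", empirical_risk(w_sgd))
```

On a benign instance like this both methods drive the empirical risk down; the paper's point is that on a worst-case stochastic convex problem, the full-batch access model forces either $\Omega(1/\epsilon^4)$ iterations or dimension-dependent sample complexity, whereas SGD's per-example access does not.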
Author Information
Idan Amir (Tel-Aviv University)
Yair Carmon (Stanford University)
Tomer Koren (Tel Aviv University & Google)
Roi Livni (Tel Aviv University)
More from the Same Authors
- 2021 Spotlight: Littlestone Classes are Privately Online Learnable »
  Noah Golowich · Roi Livni
- 2022 Poster: Thinking Outside the Ball: Optimal Learning with Gradient Descent for Generalized Linear Stochastic Convex Optimization »
  Idan Amir · Roi Livni · Nati Srebro
- 2022 Poster: Better Best of Both Worlds Bounds for Bandits with Switching Costs »
  Idan Amir · Guy Azov · Tomer Koren · Roi Livni
- 2021 Poster: Algorithmic Instabilities of Accelerated Gradient Descent »
  Amit Attia · Tomer Koren
- 2021 Oral: Optimal Rates for Random Order Online Optimization »
  Uri Sherman · Tomer Koren · Yishay Mansour
- 2021 Poster: Towards Best-of-All-Worlds Online Learning with Feedback Graphs »
  Liad Erez · Tomer Koren
- 2021 Poster: Stochastic Bias-Reduced Gradient Methods »
  Hilal Asi · Yair Carmon · Arun Jambulapati · Yujia Jin · Aaron Sidford
- 2021 Poster: Littlestone Classes are Privately Online Learnable »
  Noah Golowich · Roi Livni
- 2021 Poster: Optimal Rates for Random Order Online Optimization »
  Uri Sherman · Tomer Koren · Yishay Mansour
- 2021 Poster: Asynchronous Stochastic Optimization Robust to Arbitrary Delays »
  Alon Cohen · Amit Daniely · Yoel Drori · Tomer Koren · Mariano Schain
- 2020 Poster: Can Implicit Bias Explain Generalization? Stochastic Convex Optimization as a Case Study »
  Assaf Dauber · Meir Feder · Tomer Koren · Roi Livni
- 2020 Poster: Prediction with Corrupted Expert Advice »
  Idan Amir · Idan Attias · Tomer Koren · Yishay Mansour · Roi Livni
- 2020 Spotlight: Prediction with Corrupted Expert Advice »
  Idan Amir · Idan Attias · Tomer Koren · Yishay Mansour · Roi Livni
- 2019 : Private Stochastic Convex Optimization: Optimal Rates in Linear Time »
  Vitaly Feldman · Tomer Koren · Kunal Talwar
- 2019 Poster: Memory Efficient Adaptive Optimization »
  Rohan Anil · Vineet Gupta · Tomer Koren · Yoram Singer
- 2019 Poster: Robust Bi-Tempered Logistic Loss Based on Bregman Divergences »
  Ehsan Amid · Manfred K. Warmuth · Rohan Anil · Tomer Koren
- 2017 Poster: Affine-Invariant Online Optimization and the Low-rank Experts Problem »
  Tomer Koren · Roi Livni
- 2017 Poster: Multi-Armed Bandits with Metric Movement Costs »
  Tomer Koren · Roi Livni · Yishay Mansour