We consider a line-search method for continuous optimization in a stochastic setting where function values and gradients are available only through inexact probabilistic zeroth- and first-order oracles. These oracles capture multiple standard settings, including expected loss minimization and zeroth-order optimization. Moreover, our framework is very general and allows the function and gradient estimates to be biased. The proposed algorithm is simple to describe and easy to implement, and it uses these oracles much as a standard deterministic line search uses exact function and gradient values. Under fairly general conditions on the oracles, we derive a high-probability tail bound on the iteration complexity of the algorithm when applied to non-convex smooth functions. These results are stronger than those for other existing stochastic line-search methods and apply in more general settings.
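To make the template concrete, here is a minimal sketch of a stochastic Armijo-type line search driven by noisy oracles. This is an illustrative reading of the setup described above, not the paper's exact algorithm: the oracle interfaces `f_oracle`/`g_oracle` and the constants `theta` (sufficient-decrease parameter) and `gamma` (step-size expansion/contraction factor) are assumptions made for the example.

```python
import numpy as np

def stochastic_line_search(f_oracle, g_oracle, x0, alpha0=1.0, theta=0.1,
                           gamma=2.0, alpha_max=10.0, max_iters=1000, tol=1e-6):
    """Armijo-style line search using inexact probabilistic oracles.

    f_oracle(x): (possibly biased) estimate of f(x)   -- zeroth-order oracle
    g_oracle(x): (possibly biased) estimate of grad f -- first-order oracle
    """
    x, alpha = np.asarray(x0, dtype=float), alpha0
    for _ in range(max_iters):
        g = g_oracle(x)                       # first-order oracle call
        if np.linalg.norm(g) <= tol:
            break
        x_trial = x - alpha * g
        # Sufficient-decrease test, evaluated with zeroth-order estimates.
        if f_oracle(x_trial) <= f_oracle(x) - theta * alpha * np.dot(g, g):
            x = x_trial                       # successful step: accept, grow alpha
            alpha = min(gamma * alpha, alpha_max)
        else:
            alpha /= gamma                    # unsuccessful step: shrink alpha
    return x

# Usage example: minimize a smooth quadratic observed through noisy oracles.
rng = np.random.default_rng(0)
f = lambda x: x @ x + 0.01 * rng.standard_normal()
g = lambda x: 2 * x + 0.01 * rng.standard_normal(x.shape)
x_star = stochastic_line_search(f, g, x0=np.ones(5))
```

Because both the trial point and the incumbent are re-evaluated through the stochastic oracle at each test, the sufficient-decrease check can pass or fail erroneously; the high-probability complexity analysis referenced in the abstract is what controls the effect of such erroneous steps.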
Author Information
Billy Jin (Cornell University)
Katya Scheinberg (Cornell)
Miaolan Xie (Cornell University)
More from the Same Authors
- 2021 : High Probability Step Size Lower Bound for Adaptive Stochastic Optimization
  Katya Scheinberg · Miaolan Xie
- 2022 : Stochastic Adaptive Regularization Method with Cubics: A High Probability Complexity Bound
  Katya Scheinberg · Miaolan Xie
- 2023 Workshop: OPT 2023: Optimization for Machine Learning
  Cristóbal Guzmán · Courtney Paquette · Katya Scheinberg · Aaron Sidford · Sebastian Stich
- 2022 : Katya Scheinberg, Stochastic Oracles and Where to Find Them
  Katya Scheinberg
- 2022 Poster: Online Bipartite Matching with Advice: Tight Robustness-Consistency Tradeoffs for the Two-Stage Model
  Billy Jin · Will Ma
- 2021 Workshop: OPT 2021: Optimization for Machine Learning
  Courtney Paquette · Quanquan Gu · Oliver Hinder · Katya Scheinberg · Sebastian Stich · Martin Takac