Several classical adaptive optimization algorithms, such as line search and trust-region methods, have recently been extended to stochastic settings. Unlike the stochastic gradient method and its many variants, these algorithms do not use a pre-specified sequence of step sizes, but increase or decrease the step size adaptively according to the estimated progress of the algorithm. They rely on stochastic oracles that estimate function values, gradients, and, in some cases, Hessians. The accuracy required of these oracles is also adaptive and depends on the step size. In the deterministic setting, a lower bound on the step size is easily derived; in the stochastic setting, however, bounds on the step size had not previously been derived because the oracles can fail. In this paper, we give a lower bound on the step size that holds with high probability. This bound depends on the probability of oracle failure and recovers the deterministic result as the extreme case in which this probability is zero.
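To make the adaptive mechanism concrete, here is a minimal Python sketch of one step of a generic stochastic Armijo line search of the kind the abstract describes: the step size alpha grows after a successful sufficient-decrease test and shrinks otherwise, and the accuracy requested from the oracles tightens with alpha. The oracle interfaces `f_oracle(x, accuracy)` and `g_oracle(x, accuracy)`, the noise model, and all parameter choices are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def stochastic_line_search_step(x, alpha, f_oracle, g_oracle,
                                theta=0.5, gamma=2.0, alpha_max=1.0):
    # Oracle accuracy is tied to the current step size: smaller steps
    # demand more accurate estimates (hypothetical interface).
    g = g_oracle(x, accuracy=alpha)
    f_x = f_oracle(x, accuracy=alpha ** 2)
    x_trial = x - alpha * g
    f_trial = f_oracle(x_trial, accuracy=alpha ** 2)
    # Stochastic Armijo sufficient-decrease test on estimated values.
    if f_trial <= f_x - theta * alpha * np.dot(g, g):
        return x_trial, min(gamma * alpha, alpha_max)  # success: accept step, grow alpha
    return x, alpha / gamma                            # failure: reject step, shrink alpha

# Example on a noisy quadratic; oracle noise scales with the requested accuracy
# (an illustrative assumption).
rng = np.random.default_rng(0)
f_oracle = lambda x, accuracy: 0.5 * np.dot(x, x) + accuracy * rng.standard_normal()
g_oracle = lambda x, accuracy: x + accuracy * rng.standard_normal(x.shape)

x, alpha = np.ones(5), 1.0
for _ in range(200):
    x, alpha = stochastic_line_search_step(x, alpha, f_oracle, g_oracle)
```

Because a noisy oracle can wrongly reject a good step, alpha can be driven down by failures alone; the paper's high-probability lower bound on the step size controls exactly this behavior.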
Author Information
Katya Scheinberg (Cornell)
Miaolan Xie (Cornell University)
More from the Same Authors
- 2022: Stochastic Adaptive Regularization Method with Cubics: A High Probability Complexity Bound
  Katya Scheinberg · Miaolan Xie
- 2023 Workshop: OPT 2023: Optimization for Machine Learning
  Cristóbal Guzmán · Courtney Paquette · Katya Scheinberg · Aaron Sidford · Sebastian Stich
- 2022: Katya Scheinberg, Stochastic Oracles and Where to Find Them
  Katya Scheinberg
- 2021 Workshop: OPT 2021: Optimization for Machine Learning
  Courtney Paquette · Quanquan Gu · Oliver Hinder · Katya Scheinberg · Sebastian Stich · Martin Takac
- 2021 Poster: High Probability Complexity Bounds for Line Search Based on Stochastic Oracles
  Billy Jin · Katya Scheinberg · Miaolan Xie