Title: Stochastic Oracles and Where to Find Them
Abstract: Continuous optimization is a mature field which has recently undergone major expansion and change. One of the key new directions is the development of methods that do not require exact information about the objective function. Nevertheless, the majority of these methods, from stochastic gradient descent to "zeroth-order" methods, use some kind of approximate first-order information. We will introduce a general definition of a stochastic oracle and show how this definition applies in a variety of familiar settings, including simple stochastic gradients via sampling and traditional and randomized finite-difference methods, as well as more specialized settings such as robust gradient estimation. We will also survey several stochastic methods and discuss how the general definition extends to the oracles they use.
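As an illustrative sketch only (not material from the talk), the snippet below shows two stochastic oracles of the kind the abstract mentions: a first-order oracle built by mini-batch sampling and a zeroth-order oracle built from randomized finite differences. The function names, the quadratic test problem, and all parameter choices are hypothetical.

```python
import numpy as np

def sampling_gradient_oracle(grad_fi, x, n, batch_size, rng):
    """Stochastic first-order oracle: average the gradients of a
    randomly sampled mini-batch of the n component functions f_i."""
    idx = rng.choice(n, size=batch_size, replace=False)
    return np.mean([grad_fi(i, x) for i in idx], axis=0)

def randomized_finite_difference_oracle(f, x, h, num_dirs, rng):
    """Zeroth-order oracle: estimate the gradient from forward
    differences of f along random Gaussian directions."""
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(num_dirs):
        u = rng.standard_normal(d)
        g += (f(x + h * u) - f(x)) / h * u
    return g / num_dirs

# Hypothetical test problem: f(x) = (1/n) * sum_i (a_i^T x - b_i)^2
rng = np.random.default_rng(0)
n, d = 100, 5
A, b = rng.standard_normal((n, d)), rng.standard_normal(n)
f = lambda x: np.mean((A @ x - b) ** 2)
grad_fi = lambda i, x: 2.0 * (A[i] @ x - b[i]) * A[i]

# Plain stochastic gradient descent driven by either oracle.
x = np.zeros(d)
for t in range(200):
    g = sampling_gradient_oracle(grad_fi, x, n, batch_size=10, rng=rng)
    # g = randomized_finite_difference_oracle(f, x, h=1e-4, num_dirs=10, rng=rng)
    x -= 0.05 * g
```

Both functions return a (possibly biased, noisy) estimate of the true gradient, which is exactly the role a stochastic oracle plays in the analysis of methods such as stochastic line search or adaptive regularization.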
Author Information
Katya Scheinberg (Cornell)
More from the Same Authors
- 2021: High Probability Step Size Lower Bound for Adaptive Stochastic Optimization
  Katya Scheinberg · Miaolan Xie
- 2022: Stochastic Adaptive Regularization Method with Cubics: A High Probability Complexity Bound
  Katya Scheinberg · Miaolan Xie
- 2023 Workshop: OPT 2023: Optimization for Machine Learning
  Cristóbal Guzmán · Courtney Paquette · Katya Scheinberg · Aaron Sidford · Sebastian Stich
- 2021 Workshop: OPT 2021: Optimization for Machine Learning
  Courtney Paquette · Quanquan Gu · Oliver Hinder · Katya Scheinberg · Sebastian Stich · Martin Takac
- 2021 Poster: High Probability Complexity Bounds for Line Search Based on Stochastic Oracles
  Billy Jin · Katya Scheinberg · Miaolan Xie