Acceleration with a Ball Optimization Oracle
Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian
Oral presentation: Orals & Spotlights Track 21: Optimization
on Wed, Dec 9th, 2020 @ 14:30 – 14:45 GMT
Poster Session 4 (more posters)
on Wed, Dec 9th, 2020 @ 17:00 – 19:00 GMT
GatherTown: Optimization ( Town E3 - Spot C1 )
Abstract: Consider an oracle which takes a point x and returns the minimizer of a convex function f in an \ell_2 ball of radius r around x. It is straightforward to show that roughly r^{-1} \log(1/\epsilon) calls to the oracle suffice to find an \epsilon-approximate minimizer of f in an \ell_2 unit ball. Perhaps surprisingly, this is not optimal: we design an accelerated algorithm which attains an \epsilon-approximate minimizer with roughly r^{-2/3} \log(1/\epsilon) oracle queries, and give a matching lower bound. Further, we implement ball optimization oracles for functions with a locally stable Hessian using a variant of Newton's method and, in certain cases, stochastic first-order methods. The resulting algorithms apply to a number of problems of practical and theoretical import, improving upon previous results for logistic and \ell_\infty regression and achieving guarantees comparable to the state-of-the-art for \ell_p regression.
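To make the "straightforward" baseline in the abstract concrete, here is a minimal sketch of the non-accelerated scheme: repeatedly call the ball optimization oracle at the current iterate. It assumes a toy objective f(x) = 0.5 ||x - x_star||^2 (not from the paper), for which the oracle has a closed form: move from x toward x_star by at most r. The function names `ball_oracle` and `minimize_with_ball_oracle` are hypothetical, chosen only for illustration; this does not implement the paper's accelerated r^{-2/3} method.

```python
import math

def ball_oracle(x, x_star, r):
    """Exact minimizer of f(y) = 0.5 * ||y - x_star||^2 over the
    l_2 ball of radius r centered at x: step toward x_star, capped at r."""
    d = math.dist(x, x_star)
    if d <= r:
        return list(x_star)          # true minimizer lies inside the ball
    t = r / d                        # fraction of the way to x_star
    return [xi + t * (si - xi) for xi, si in zip(x, x_star)]

def minimize_with_ball_oracle(x0, x_star, r, eps):
    """Naive scheme: iterate the oracle until the gap f(x) - f(x_star)
    drops below eps. For this quadratic the distance to x_star shrinks
    by r per call, so roughly (1/r) * log(1/eps) calls suffice."""
    x, calls = list(x0), 0
    while 0.5 * math.dist(x, x_star) ** 2 > eps:
        x = ball_oracle(x, x_star, r)
        calls += 1
    return x, calls

x_hat, calls = minimize_with_ball_oracle(
    x0=[0.0, 0.0], x_star=[0.6, 0.0], r=0.1, eps=1e-9)
```

With these parameters the iterate covers distance 0.6 in steps of length r = 0.1, so the oracle is called a handful of times before the gap falls below eps; the paper's contribution is an accelerated scheme whose call count scales as r^{-2/3} rather than r^{-1}.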