Spotlight
Wed Dec 05 01:40 PM -- 01:45 PM (PST) @ Room 517 CD
Natasha 2: Faster Non-Convex Optimization Than SGD
Zeyuan Allen-Zhu
We design a stochastic algorithm to find $\varepsilon$-approximate local minima of any smooth nonconvex function at rate $O(\varepsilon^{-3.25})$, with only oracle access to stochastic gradients. The best result before this work was the $O(\varepsilon^{-4})$ rate of stochastic gradient descent (SGD).
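
For context, an $\varepsilon$-approximate local minimum is commonly taken to mean a point $x$ with $\|\nabla f(x)\| \le \varepsilon$ and $\nabla^2 f(x) \succeq -\sqrt{\varepsilon}\, I$. The sketch below illustrates the oracle-access setting the abstract describes, using plain SGD, the $O(\varepsilon^{-4})$ baseline it compares against; the test function, noise level, step size, and iteration budget are illustrative assumptions, not choices from the paper.

```python
import numpy as np

# Illustrative sketch only: f, the noise level, the step size, and the
# iteration budget are assumptions for demonstration, not taken from the paper.

rng = np.random.default_rng(0)

def f(x):
    # A smooth nonconvex test function (assumed for illustration).
    return np.sum(x ** 2) + np.sum(np.sin(3.0 * x))

def grad_f(x):
    # Exact gradient of f, used here only to measure progress at the end.
    return 2.0 * x + 3.0 * np.cos(3.0 * x)

def stochastic_gradient(x, sigma=0.05):
    # The oracle the abstract allows: an unbiased, noisy gradient estimate.
    return grad_f(x) + sigma * rng.standard_normal(x.shape)

def sgd(x0, lr=0.01, n_iters=20_000):
    # The O(eps^{-4}) baseline: plain SGD driven only by oracle queries.
    x = x0.copy()
    for _ in range(n_iters):
        x -= lr * stochastic_gradient(x)
    return x

x = sgd(rng.standard_normal(10))
print("final ||grad f(x)|| =", np.linalg.norm(grad_f(x)))
# Note: a full eps-approximate local minimum additionally requires the
# second-order condition lambda_min(Hessian of f at x) >= -sqrt(eps);
# Natasha 2 addresses this with an additional negative-curvature search
# on top of its stochastic gradient steps.
```

SGD only drives the gradient norm down; the improvement in the paper comes from escaping saddle points faster than the noise in SGD does on its own, which is where the rate drops from $O(\varepsilon^{-4})$ to $O(\varepsilon^{-3.25})$.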