Poster
Natasha 2: Faster Non-Convex Optimization Than SGD
Zeyuan Allen-Zhu

Wed Dec 05 02:00 PM -- 04:00 PM (PST) @ Room 210 #50
We design a stochastic algorithm to find $\varepsilon$-approximate local minima of any smooth nonconvex function at rate $O(\varepsilon^{-3.25})$, with only oracle access to stochastic gradients. The best rate known before this work was $O(\varepsilon^{-4})$, achieved by stochastic gradient descent (SGD).
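The abstract compares query complexities for reaching an $\varepsilon$-approximate local minimum, which (in the usual convention) requires both a small gradient, $\|\nabla f(x)\| \le \varepsilon$, and an almost positive-semidefinite Hessian. Below is a minimal sketch of the $O(\varepsilon^{-4})$ SGD baseline named in the abstract, not of Natasha 2 itself; the objective `f`, the noise model in `stochastic_grad`, and all hyperparameters are illustrative assumptions, and the stopping rule checks only the first-order condition.

```python
import numpy as np

# Illustrative smooth nonconvex objective (an assumption, not from the paper):
# f(x) = sum_i x_i^2 / (1 + x_i^2) is smooth, bounded, and nonconvex.
def f(x):
    return np.sum(x**2 / (1.0 + x**2))

def grad_f(x):
    # Exact gradient, used here only to check the stopping criterion.
    return 2.0 * x / (1.0 + x**2) ** 2

def stochastic_grad(x, noise=0.05, rng=np.random):
    # "Oracle access to stochastic gradients": an unbiased, noisy estimate.
    return grad_f(x) + noise * rng.standard_normal(x.shape)

def sgd_baseline(x0, eps=0.1, lr=0.01, max_iters=1_000_000, check_every=100):
    """Plain SGD: the O(eps^{-4}) baseline the abstract compares against.

    Stops once ||grad f(x)|| <= eps, a first-order surrogate for the
    eps-approximate local minimum criterion (the Hessian condition that
    rules out saddle points is omitted in this sketch).
    """
    x = x0.copy()
    for t in range(max_iters):
        x -= lr * stochastic_grad(x)
        if t % check_every == 0 and np.linalg.norm(grad_f(x)) <= eps:
            return x, t
    return x, max_iters

if __name__ == "__main__":
    x, iters = sgd_baseline(np.random.randn(10))
    print(f"||grad f(x)|| <= eps after {iters} stochastic gradient steps")
```

Natasha 2 improves on this loop, roughly speaking, by combining variance-reduced gradient estimates with an online search for negative-curvature directions to escape saddle points; see the paper for the precise algorithm and the exact parameterization of the second-order condition.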

Author Information

Zeyuan Allen-Zhu (Microsoft Research)
