Oral
Stochastic Cubic Regularization for Fast Nonconvex Optimization
Nilesh Tripuraneni · Mitchell Stern · Chi Jin · Jeffrey Regier · Michael Jordan

Wed Dec 05 12:50 PM -- 01:05 PM (PST) @ Room 517 CD
This paper proposes a stochastic variant of a classic algorithm---the cubic-regularized Newton method [Nesterov and Polyak, 2006]. The proposed algorithm efficiently escapes saddle points and finds approximate local minima for general smooth, nonconvex functions in only $\tilde{\mathcal{O}}(\epsilon^{-3.5})$ stochastic gradient and stochastic Hessian-vector product evaluations, the latter of which can be computed as efficiently as stochastic gradients. This improves upon the $\tilde{\mathcal{O}}(\epsilon^{-4})$ rate of stochastic gradient descent. Our rate matches the best-known result for finding local minima without requiring any delicate acceleration or variance-reduction techniques.
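The key computational primitive behind this rate is the stochastic Hessian-vector product, which automatic differentiation delivers at roughly the cost of a gradient. The sketch below is illustrative only, not the authors' implementation: it assumes a hypothetical minibatch least-squares `loss`, and approximately minimizes the cubic-regularized model $m(s) = g^\top s + \tfrac{1}{2} s^\top H s + \tfrac{\rho}{6}\|s\|^3$ by gradient descent, touching the Hessian only through Hessian-vector products.

```python
import jax
import jax.numpy as jnp

# Hypothetical minibatch objective: least-squares regression loss.
def loss(w, X, y):
    return jnp.mean((X @ w - y) ** 2)

def hvp(w, v, X, y):
    """Stochastic Hessian-vector product H(w) @ v via forward-over-reverse
    autodiff (Pearlmutter's trick); costs a small constant times a gradient."""
    _, tangent = jax.jvp(lambda u: jax.grad(loss)(u, X, y), (w,), (v,))
    return tangent

def cubic_step(w, X, y, rho=1.0, lr=0.01, n_inner=100):
    """One simplified cubic-regularized Newton step: approximately minimize
    m(s) = g.s + 0.5 s.H s + (rho/6)||s||^3 by gradient descent on s,
    accessing the Hessian only through Hessian-vector products."""
    g = jax.grad(loss)(w, X, y)
    s = jnp.zeros_like(w)
    for _ in range(n_inner):
        # grad m(s) = g + H s + (rho/2) ||s|| s
        grad_m = g + hvp(w, s, X, y) + 0.5 * rho * jnp.linalg.norm(s) * s
        s = s - lr * grad_m
    return w + s

# Toy usage on a fixed synthetic minibatch.
X = jax.random.normal(jax.random.PRNGKey(0), (64, 10))
y = jax.random.normal(jax.random.PRNGKey(1), (64,))
w = jnp.zeros(10)
for _ in range(20):
    w = cubic_step(w, X, y)
print(loss(w, X, y))
```

A faithful implementation would resample minibatches between evaluations and use the paper's termination criteria for the inner solver; the fixed step size, `rho`, and iteration counts here are placeholder choices.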

Author Information

Nilesh Tripuraneni (UC Berkeley)
Mitchell Stern (UC Berkeley)
Chi Jin (UC Berkeley)
Jeffrey Regier (UC Berkeley)
Michael Jordan (UC Berkeley)
