Poster
Third-order Smoothness Helps: Faster Stochastic Optimization Algorithms for Finding Local Minima
Yaodong Yu · Pan Xu · Quanquan Gu

Thu Dec 06 02:00 PM -- 04:00 PM (PST) @ Room 210 #18
We propose stochastic optimization algorithms that can find local minima faster than existing algorithms for nonconvex optimization problems, by exploiting third-order smoothness to escape non-degenerate saddle points more efficiently. More specifically, the proposed algorithm needs only $\tilde{O}(\epsilon^{-10/3})$ stochastic gradient evaluations to converge to an approximate local minimum $\mathbf{x}$, which satisfies $\|\nabla f(\mathbf{x})\|_2\leq\epsilon$ and $\lambda_{\min}(\nabla^2 f(\mathbf{x}))\geq -\sqrt{\epsilon}$, in unconstrained stochastic optimization, where $\tilde{O}(\cdot)$ hides polylogarithmic factors and constants. This improves upon the $\tilde{O}(\epsilon^{-7/2})$ gradient complexity achieved by state-of-the-art stochastic local-minima-finding algorithms by a factor of $\tilde{O}(\epsilon^{-1/6})$. Experiments on two nonconvex optimization problems demonstrate the effectiveness of our algorithm and corroborate our theory.
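The stated speedup factor follows directly from the exponents: $\tilde{O}(\epsilon^{-7/2})/\tilde{O}(\epsilon^{-10/3}) = \tilde{O}(\epsilon^{-(7/2-10/3)}) = \tilde{O}(\epsilon^{-1/6})$. As a concrete reading of the stopping criterion, below is a minimal sketch (a hypothetical helper, not code from the paper) that tests whether a point with a given gradient and Hessian satisfies the $(\epsilon, \sqrt{\epsilon})$-approximate-local-minimum condition above.

import numpy as np

def is_approx_local_min(grad, hess, eps):
    """Hypothetical check of the abstract's stopping criterion:
    ||grad f(x)||_2 <= eps  and  lambda_min(Hess f(x)) >= -sqrt(eps)."""
    grad_ok = np.linalg.norm(grad) <= eps
    # eigvalsh returns the eigenvalues of a symmetric matrix in
    # ascending order, so the first entry is lambda_min.
    hess_ok = np.linalg.eigvalsh(hess)[0] >= -np.sqrt(eps)
    return grad_ok and hess_ok

# Example: f(x) = x_1^2 - x_2^2 has a non-degenerate saddle at the origin;
# the gradient vanishes there, but lambda_min = -2 < -sqrt(eps), so the
# check correctly rejects the origin as an approximate local minimum.
print(is_approx_local_min(np.zeros(2), np.diag([2.0, -2.0]), eps=1e-3))  # False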

Author Information

Yaodong Yu (University of Virginia / Petuum)
Pan Xu (UCLA)
Quanquan Gu (UCLA)
