Poster
Efficiently avoiding saddle points with zero order methods: No gradients required
Emmanouil-Vasileios Vlatakis-Gkaragkounis · Lampros Flokas · Georgios Piliouras

Thu Dec 12 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #198
We consider the case of derivative-free algorithms for non-convex optimization, also known as zero-order algorithms, that use only function evaluations rather than gradients. For a wide variety of gradient approximators based on finite differences, we establish asymptotic convergence to second-order stationary points using a carefully tailored application of the Stable Manifold Theorem. Regarding efficiency, we introduce a noisy zero-order method that converges to second-order stationary points, i.e., it avoids saddle points. Our algorithm uses only $\tilde{\mathcal{O}}(1 / \epsilon^2)$ approximate gradient calculations and thus matches the convergence rate guarantees of its exact-gradient counterparts up to constants. In contrast to previous work, our convergence rate analysis avoids imposing additional dimension-dependent slowdowns in the number of iterations required for non-convex zero-order optimization.
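To illustrate the general idea described in the abstract, the minimal sketch below combines a forward-difference gradient estimator with small Gaussian perturbations at each step, which is one simple way a noisy zero-order method can escape strict saddle points. This is not the paper's exact algorithm; the function names, hyperparameters, and test function are illustrative assumptions.

```python
import numpy as np

def fd_gradient(f, x, h=1e-4):
    """Forward-difference gradient estimate using only function evaluations."""
    d = x.shape[0]
    g = np.zeros(d)
    fx = f(x)
    for i in range(d):
        e = np.zeros(d)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def noisy_zero_order_descent(f, x0, eta=1e-2, sigma=1e-3, steps=10_000, h=1e-4, seed=0):
    """Gradient descent on finite-difference gradient estimates, with Gaussian
    noise injected at every iteration to help escape strict saddle points.
    (Illustrative sketch, not the algorithm analyzed in the paper.)"""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = fd_gradient(f, x, h)
        x = x - eta * g + sigma * rng.standard_normal(x.shape)
    return x

# Example: f(x, y) = x^4/4 - x^2/2 + y^2/2 has a strict saddle at the origin
# and minima at (+/-1, 0). Starting exactly at the saddle, the injected noise
# pushes the iterates off the stable manifold toward a minimum.
f = lambda z: z[0] ** 4 / 4 - z[0] ** 2 / 2 + z[1] ** 2 / 2
x_final = noisy_zero_order_descent(f, np.zeros(2))
print(x_final)  # close to (1, 0) or (-1, 0)
```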

Author Information

Emmanouil-Vasileios Vlatakis-Gkaragkounis (Columbia University)
Lampros Flokas (Columbia University)
Georgios Piliouras (Singapore University of Technology and Design)