

Plenary Speaker in Workshop: OPT 2021: Optimization for Machine Learning

Avoiding saddle points in nonsmooth optimization, Damek Davis



Abstract:

We introduce a geometrically transparent strict saddle property for nonsmooth functions. When present, this property guarantees that simple randomly initialized proximal algorithms on weakly convex problems converge only to local minimizers. We argue that the strict saddle property may be a realistic assumption in applications since it provably holds for generic semi-algebraic optimization problems. Finally, we close the talk with an extension of the result to "perturbed" subgradient methods.
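
For a rough sense of the kind of behavior the abstract describes, here is a minimal sketch (not from the talk; the toy function f and all names below are my own illustration) of a randomly initialized, perturbed subgradient method on a simple weakly convex function whose critical points are two minimizers and a nonsmooth strict saddle at the origin:

```python
import numpy as np

# Illustrative toy problem (not from the talk):
#   f(x, y) = |x^2 - 1| + y^2
# Minimizers sit at (+-1, 0); the origin is a critical point (a strict
# saddle) that a randomly initialized method is expected to avoid.

def subgradient(v):
    """One element of the Clarke subdifferential of f at v = (x, y)."""
    x, y = v
    # d/dx |x^2 - 1| = 2x * sign(x^2 - 1) away from the kink; take 0 on it.
    gx = 2.0 * x * np.sign(x**2 - 1.0)
    gy = 2.0 * y
    return np.array([gx, gy])

def perturbed_subgradient_method(step=1e-2, noise=1e-3, iters=5000, seed=0):
    rng = np.random.default_rng(seed)
    v = rng.normal(size=2)                # random initialization
    for _ in range(iters):
        g = subgradient(v)
        xi = noise * rng.normal(size=2)   # small random perturbation
        v = v - step * (g + xi)
    return v

if __name__ == "__main__":
    v = perturbed_subgradient_method()
    # Expected to land near (+1, 0) or (-1, 0), not at the saddle (0, 0).
    print(v)
```

With a constant step size the iterates only settle into a small neighborhood of a minimizer; this sketch is meant to illustrate saddle avoidance under random initialization and perturbation, not the precise convergence guarantees of the talk.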