

Poster

A gradient sampling method with complexity guarantees for Lipschitz functions in high and low dimensions

Damek Davis · Dmitriy Drusvyatskiy · Yin Tat Lee · Swati Padmanabhan · Guanghao Ye

Hall J (level 1) #837

Keywords: [ nonconvex optimization ] [ nonsmooth optimization ] [ Goldstein subdifferential ] [ nonconvex nonsmooth optimization ] [ cutting plane method ]


Abstract: Zhang et al. (ICML 2020) introduced a novel modification of Goldstein's classical subgradient method, with an efficiency guarantee of $O(\varepsilon^{-4})$ for minimizing Lipschitz functions. Their work, however, makes use of an oracle that is not efficiently implementable. In this paper, we obtain the same efficiency guarantee with a standard subgradient oracle, thus making our algorithm efficiently implementable. Our resulting method works on any Lipschitz function whose value and gradient can be evaluated at points of differentiability. We additionally present a new cutting plane algorithm that achieves an efficiency of $O(d\varepsilon^{-2}\log S)$ for the class of $S$-smooth (and possibly non-convex) functions in low dimensions. Strikingly, this $\varepsilon$-dependence matches the lower bounds for the convex setting.
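For intuition about the kind of oracle access the abstract refers to, the sketch below shows a classic Goldstein-style gradient-sampling step that uses only a standard gradient oracle queried at (randomly sampled) points of differentiability. This is an illustrative sketch of the general technique, not the authors' algorithm or its complexity-guaranteed variant; the helper names (`goldstein_step`, `min_norm_in_hull`), the sample count `m`, and the radius `delta` are assumptions made for the example.

```python
import numpy as np


def sample_ball(rng, m, d):
    """Sample m points uniformly from the unit ball in R^d."""
    v = rng.standard_normal((m, d))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    r = rng.random((m, 1)) ** (1.0 / d)
    return v * r


def min_norm_in_hull(G, iters=200):
    """Approximate the minimum-norm point in the convex hull of the rows of G
    via Frank-Wolfe with exact line search on ||lambda^T G||^2."""
    m = G.shape[0]
    lam = np.full(m, 1.0 / m)
    for _ in range(iters):
        g = lam @ G                      # current point in the hull
        i = np.argmin(G @ g)             # Frank-Wolfe vertex: smallest <g_i, g>
        d = G[i] - g
        denom = d @ d
        if denom < 1e-18:
            break
        t = np.clip(-(g @ d) / denom, 0.0, 1.0)
        lam *= (1.0 - t)
        lam[i] += t
    return lam @ G


def goldstein_step(grad_f, x, delta, m=20, rng=None):
    """One Goldstein-style step: sample gradients in a delta-ball around x,
    form an approximate minimum-norm element of their convex hull, and take a
    normalized step of length delta. A gradient oracle at points of
    differentiability (which hold almost everywhere for Lipschitz f) suffices."""
    rng = np.random.default_rng() if rng is None else rng
    pts = x + delta * sample_ball(rng, m, x.size)
    G = np.stack([grad_f(p) for p in pts])
    g = min_norm_in_hull(G)
    gnorm = np.linalg.norm(g)
    if gnorm <= 1e-12:
        return x, gnorm                  # x is approximately (delta, eps)-stationary
    return x - delta * g / gnorm, gnorm


if __name__ == "__main__":
    # Toy nonsmooth Lipschitz objective: f(x) = |x_0| + 2|x_1|.
    f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
    grad_f = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])

    rng = np.random.default_rng(0)
    x, delta = np.array([3.0, -2.0]), 0.1
    for _ in range(100):
        x, gnorm = goldstein_step(grad_f, x, delta, rng=rng)
        if gnorm <= 1e-3:
            break
    print("final point:", x, "f(x) =", f(x))
```

In this sketch the minimum-norm step plays the role of descent with respect to the Goldstein subdifferential at radius delta; the paper's contribution concerns how to obtain such (near-)descent directions with provable $O(\varepsilon^{-4})$ oracle complexity using only this kind of standard subgradient access.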
