Poster
SAPD+: An Accelerated Stochastic Method for Nonconvex-Concave Minimax Problems
Xuan Zhang · Necdet Serhat Aybat · Mert Gurbuzbalaban
We propose a new stochastic method, SAPD+, for solving nonconvex-concave minimax problems of the form $\min_x\max_y\mathcal{L}(x,y)=f(x)+\Phi(x,y)-g(y)$, where $f,g$ are closed convex functions and $\Phi(x,y)$ is a smooth function that is weakly convex in $x$ and (strongly) concave in $y$. For both the strongly concave and merely concave settings, SAPD+ achieves the best known oracle complexities of $\mathcal{O}(L\kappa_y\epsilon^{-4})$ and $\mathcal{O}(L^3\epsilon^{-6})$, respectively, without assuming compactness of the problem domain, where $\kappa_y$ is the condition number and $L$ is the Lipschitz constant. We also propose SAPD+ with variance reduction, which enjoys the best known oracle complexity of $\mathcal{O}(L\kappa_y^2\epsilon^{-3})$ for the weakly convex-strongly concave setting. We demonstrate the efficiency of SAPD+ on a distributionally robust learning problem with a nonconvex regularizer and also on a multi-class classification problem in deep learning.
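The abstract does not spell out the SAPD+ updates, but the problem template itself is easy to illustrate. Below is a minimal sketch of plain stochastic gradient descent-ascent (a baseline method, not SAPD+) on a toy instance of $\min_x\max_y f(x)+\Phi(x,y)-g(y)$, where the choices $f(x)=x^2/2$, $\Phi(x,y)=xy$, and $g(y)=y^2/2$ are illustrative assumptions; the unique saddle point is $(0,0)$.

```python
import random

# Toy instance of the template  min_x max_y  L(x, y) = f(x) + Phi(x, y) - g(y)
# with f(x) = x^2/2, Phi(x, y) = x*y (bilinear), g(y) = y^2/2 (strongly concave in y).
# We run stochastic gradient descent-ascent with a noisy first-order oracle.
# This is a baseline sketch, NOT the SAPD+ algorithm from the paper.

random.seed(0)
x, y = 2.0, -1.0
eta = 0.05       # step size
sigma = 0.1      # noise level of the stochastic gradient oracle
T = 4000
avg_x = avg_y = 0.0

for t in range(T):
    # Noisy partial gradients of L at the current point (x, y):
    gx = x + y + random.gauss(0.0, sigma)   # dL/dx = f'(x) + dPhi/dx
    gy = x - y + random.gauss(0.0, sigma)   # dL/dy = dPhi/dy - g'(y)
    x -= eta * gx    # descent step in the primal variable x
    y += eta * gy    # ascent step in the dual variable y
    avg_x += x / T   # averaged iterates smooth out the oracle noise
    avg_y += y / T

print(avg_x, avg_y)  # both close to the saddle point (0, 0)
```

For this strongly convex-strongly concave toy, a small enough step size makes the averaged iterates converge to the saddle point; the weakly convex and merely concave regimes analyzed in the paper are harder and are where the quoted $\epsilon^{-4}$ and $\epsilon^{-6}$ complexities apply.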
Author Information
Xuan Zhang (The Pennsylvania State University)
Necdet Serhat Aybat (Penn State University)
Mert Gurbuzbalaban (Rutgers University)
More from the Same Authors
- 2023 Poster: Uniform-in-Time Wasserstein Stability Bounds for (Noisy) Stochastic Gradient Descent »
  Lingjiong Zhu · Mert Gurbuzbalaban · Anant Raj · Umut Simsekli
- 2023 Workshop: Heavy Tails in ML: Structure, Stability, Dynamics »
  Mert Gurbuzbalaban · Stefanie Jegelka · Michael Mahoney · Umut Simsekli
- 2019 Poster: A Universally Optimal Multistage Accelerated Stochastic Gradient Method »
  Necdet Serhat Aybat · Alireza Fallah · Mert Gurbuzbalaban · Asuman Ozdaglar
- 2016 Poster: A primal-dual method for conic constrained distributed optimization problems »
  Necdet Serhat Aybat · Erfan Yazdandoost Hamedani