Poster
Hybrid Variance-Reduced SGD Algorithms For Minimax Problems with Nonconvex-Linear Function
Quoc Tran Dinh · Deyi Liu · Lam Nguyen

Wed Dec 09 09:00 AM -- 11:00 AM (PST) @ Poster Session 3 #818
We develop a novel, single-loop variance-reduced algorithm to solve a class of stochastic nonconvex-convex minimax problems involving a nonconvex-linear objective function, which has various applications in fields such as machine learning and robust optimization. This problem class poses several computational challenges due to the nonsmoothness, nonconvexity, nonlinearity, and non-separability of its objective functions. Our approach relies on a new combination of recent ideas, including smoothing and hybrid biased variance-reduced techniques. Our algorithm and its variants achieve an $\mathcal{O}(T^{-2/3})$ convergence rate and the best-known oracle complexity under standard assumptions, where $T$ is the iteration counter. They have several computational advantages over existing methods: they are simple to implement, require less parameter tuning, work with both single-sample and mini-batch derivative estimators, and admit constant or diminishing step-sizes. We demonstrate the benefits of our algorithms over existing methods through two numerical examples, including a nonsmooth and nonconvex-non-strongly-concave minimax model.
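To make the "hybrid biased variance-reduced" ingredient concrete, below is a minimal Python sketch of a hybrid stochastic gradient estimator applied to a toy smooth minimization problem. This is an illustration of the general estimator idea only, not the paper's minimax algorithm (which additionally uses smoothing for the max component); the functions `stoch_grad` and `hybrid_sgd` and the hyperparameter values `step`, `beta`, and `T` are hypothetical choices for the sketch, not the authors' tuned settings.

```python
import numpy as np

def stoch_grad(w, sample):
    """Stochastic gradient of a toy least-squares loss
    f(w; (a, b)) = 0.5 * (a @ w - b)**2.
    Stands in for the gradient oracle; any differentiable loss works."""
    a, b = sample
    return (a @ w - b) * a

def hybrid_sgd(w0, data, step=0.1, beta=0.9, T=1000, seed=None):
    """Single-loop sketch of a hybrid variance-reduced gradient step.
    The estimator mixes a SARAH-style recursive correction with a plain
    SGD gradient using independent samples:
        v_t = beta * (v_{t-1} + g(w_t; s1) - g(w_{t-1}; s1))
              + (1 - beta) * g(w_t; s2).
    Constant step-size variant; a diminishing schedule also fits here."""
    rng = np.random.default_rng(seed)
    w_prev = w0.copy()
    v = stoch_grad(w_prev, data[rng.integers(len(data))])  # initial estimate
    w = w_prev - step * v
    for _ in range(T):
        s1 = data[rng.integers(len(data))]  # sample for the recursive term
        s2 = data[rng.integers(len(data))]  # independent sample for the SGD term
        v = beta * (v + stoch_grad(w, s1) - stoch_grad(w_prev, s1)) \
            + (1.0 - beta) * stoch_grad(w, s2)
        w_prev, w = w, w - step * v
    return w
```

Note the single-sample oracle per term here; replacing `s1` and `s2` with mini-batches is a drop-in change, matching the abstract's claim that the method works in either regime.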

Author Information

Quoc Tran Dinh (Department of Statistics and Operations Research, University of North Carolina at Chapel Hill, North Carolina)
Deyi Liu (University of North Carolina)
Lam Nguyen (IBM Research, Thomas J. Watson Research Center)