Poster
Tight Analysis of Extra-gradient and Optimistic Gradient Methods For Nonconvex Minimax Problems
Pouria Mahdavinia · Yuyang Deng · Haochuan Li · Mehrdad Mahdavi

Wed Nov 30 02:00 PM -- 04:00 PM (PST) @ Hall J #602

Despite the established convergence theory of Optimistic Gradient Descent Ascent (OGDA) and Extragradient (EG) methods for convex-concave minimax problems, little is known about the theoretical guarantees of these methods in nonconvex settings. To bridge this gap, for the first time, this paper establishes the convergence of OGDA and EG methods under the nonconvex-strongly-concave (NC-SC) and nonconvex-concave (NC-C) settings by providing a unified analysis through the lens of single-call extra-gradient methods. We further establish lower bounds on the convergence of GDA/OGDA/EG, shedding light on the tightness of our analysis. We also conduct experiments supporting our theoretical results. We believe our results will advance the theoretical understanding of OGDA and EG methods for solving complicated real-world nonconvex minimax problems, e.g., training Generative Adversarial Networks (GANs) or robust neural networks.
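To make the two methods concrete, below is a minimal sketch (not the paper's code) of the EG and OGDA updates on a toy nonconvex-strongly-concave problem min_x max_y f(x, y). The objective, step size, and iteration count are illustrative assumptions; EG takes an exploratory half step and then updates from the base point, while OGDA mimics this with a single gradient call per iteration by extrapolating with the previous gradient.

```python
# Illustrative sketch of Extragradient (EG) and Optimistic Gradient Descent
# Ascent (OGDA) for min_x max_y f(x, y); not the authors' implementation.

def grad(x, y):
    # Toy nonconvex-strongly-concave objective (an assumption for this demo):
    # f(x, y) = x**2 / (1 + x**2) + 2*x*y - y**2
    # nonconvex in x, strongly concave in y.
    gx = 2 * x / (1 + x**2) ** 2 + 2 * y   # df/dx
    gy = 2 * x - 2 * y                     # df/dy
    return gx, gy

def extragradient(x, y, eta=0.1, steps=200):
    for _ in range(steps):
        gx, gy = grad(x, y)
        # Exploratory half step: descend in x, ascend in y.
        x_half, y_half = x - eta * gx, y + eta * gy
        # Full step from the base point using the half-step gradient.
        gx, gy = grad(x_half, y_half)
        x, y = x - eta * gx, y + eta * gy
    return x, y

def ogda(x, y, eta=0.1, steps=200):
    gx_prev, gy_prev = grad(x, y)
    for _ in range(steps):
        gx, gy = grad(x, y)
        # Single-call "optimistic" update: z_{t+1} = z_t - 2*eta*F(z_t) + eta*F(z_{t-1}).
        x = x - 2 * eta * gx + eta * gx_prev
        y = y + 2 * eta * gy - eta * gy_prev
        gx_prev, gy_prev = gx, gy
    return x, y

print("EG:  ", extragradient(1.0, 1.0))
print("OGDA:", ogda(1.0, 1.0))
```

The single-call structure of OGDA (one gradient evaluation per iteration, reusing the previous one) is what the unified analysis in the paper exploits to treat both methods through one framework.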

Author Information

Pouria Mahdavinia (Pennsylvania State University)
Yuyang Deng (Pennsylvania State University)
Haochuan Li (MIT)
Mehrdad Mahdavi (Pennsylvania State University)

Mehrdad Mahdavi is an Assistant Professor of Computer Science & Engineering at Pennsylvania State University. He runs the Machine Learning and Optimization Lab, which works on fundamental problems in computational and theoretical machine learning.
