Adaptive gradient methods have shown their ability to adjust the stepsizes on the fly in a parameter-agnostic manner, and empirically achieve faster convergence for solving minimization problems. When it comes to nonconvex minimax optimization, however, current convergence analyses of gradient descent ascent (GDA) combined with adaptive stepsizes require careful tuning of hyper-parameters and the knowledge of problem-dependent parameters. Such a discrepancy arises from the primal-dual nature of minimax problems and the necessity of delicate time-scale separation between the primal and dual updates in attaining convergence. In this work, we propose a single-loop adaptive GDA algorithm called TiAda for nonconvex minimax optimization that automatically adapts to the time-scale separation. Our algorithm is fully parameter-agnostic and can achieve near-optimal complexities simultaneously in deterministic and stochastic settings of nonconvex-strongly-concave minimax problems. The effectiveness of the proposed method is further justified numerically for a number of machine learning applications.
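The abstract describes a single-loop GDA scheme in which both variables use adaptive stepsizes and the primal update is automatically kept on a slower time scale than the dual update. The sketch below illustrates one update of this flavor; the function name, the max-coupling of the accumulators, and the default exponents are illustrative assumptions, not necessarily the paper's exact TiAda update.

```python
import numpy as np

def tiada_like_gda(grad_x, grad_y, x, y, steps=1000,
                   eta_x=1.0, eta_y=1.0, alpha=0.6, beta=0.4, eps=1e-8):
    """Sketch of a single-loop adaptive GDA with time-scale separation.

    grad_x, grad_y: callables returning the partial gradients of f(x, y).
    alpha > beta so the descent (x) stepsize decays faster than the
    ascent (y) stepsize; the max-coupling below is one way to enforce
    the separation without hand-tuned stepsize ratios.
    """
    vx, vy = 0.0, 0.0  # AdaGrad-norm style squared-gradient accumulators
    for _ in range(steps):
        gx, gy = grad_x(x, y), grad_y(x, y)
        vx += float(np.sum(gx ** 2))
        vy += float(np.sum(gy ** 2))
        # Primal stepsize shrinks with whichever accumulator is larger,
        # so x is updated no faster than y (parameter-agnostic separation).
        x = x - eta_x / (max(vx, vy) ** alpha + eps) * gx
        y = y + eta_y / (vy ** beta + eps) * gy
    return x, y
```

On a toy saddle such as f(x, y) = 0.5 x^2 - 0.5 y^2 (gradients x and -y), both iterates contract toward the saddle point (0, 0) as the accumulators grow, with no stepsize tuning.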
Author Information
Xiang Li (ETH Zurich)
Junchi Yang (ETH Zurich)
Niao He (ETH Zurich)
Related Events (a corresponding poster, oral, or spotlight)
- 2022 : TiAda: A Time-scale Adaptive Algorithm For Nonconvex Minimax Optimization »
More from the Same Authors
- 2022 : Uniform Convergence and Generalization for Nonconvex Stochastic Minimax Problems »
  Siqi Zhang · Yifan Hu · Liang Zhang · Niao He
- 2022 : Poster Session 2 »
  Jinwuk Seok · Bo Liu · Ryotaro Mitsuboshi · David Martinez-Rubio · Weiqiang Zheng · Ilgee Hong · Chen Fan · Kazusato Oko · Bo Tang · Miao Cheng · Aaron Defazio · Tim G. J. Rudner · Gabriele Farina · Vishwak Srinivasan · Ruichen Jiang · Peng Wang · Jane Lee · Nathan Wycoff · Nikhil Ghosh · Yinbin Han · David Mueller · Liu Yang · Amrutha Varshini Ramesh · Siqi Zhang · Kaifeng Lyu · David Yunis · Kumar Kshitij Patel · Fangshuo Liao · Dmitrii Avdiukhin · Xiang Li · Sattar Vakili · Jiaxin Shi
- 2022 : Niao He, Simple Fixes for Adaptive Gradient Methods for Nonconvex Min-Max Optimization »
  Niao He
- 2022 Poster: Nest Your Adaptive Algorithm for Parameter-Agnostic Nonconvex Minimax Optimization »
  Junchi Yang · Xiang Li · Niao He
- 2022 Poster: Sharp Analysis of Stochastic Optimization under Global Kurdyka-Lojasiewicz Inequality »
  Ilyas Fatkhullin · Jalal Etesami · Niao He · Negar Kiyavash
- 2022 Poster: Bring Your Own Algorithm for Optimal Differentially Private Stochastic Minimax Optimization »
  Liang Zhang · Kiran Thekumparampil · Sewoong Oh · Niao He
- 2022 Poster: Stochastic Second-Order Methods Improve Best-Known Sample Complexity of SGD for Gradient-Dominated Functions »
  Saeed Masiha · Saber Salehkaleybar · Niao He · Negar Kiyavash · Patrick Thiran
- 2020 Poster: A Catalyst Framework for Minimax Optimization »
  Junchi Yang · Siqi Zhang · Negar Kiyavash · Niao He
- 2020 Poster: Global Convergence and Variance Reduction for a Class of Nonconvex-Nonconcave Minimax Problems »
  Junchi Yang · Negar Kiyavash · Niao He