Poster in Workshop: OPT 2022: Optimization for Machine Learning
Solving Constrained Variational Inequalities via a First-order Interior Point-based Method
Tong Yang · Michael Jordan · Tatjana Chavdarova
Abstract:
We address the open problem of developing a first-order method that can solve constrained variational inequality (cVI) problems with general constraints. We generalize the alternating direction method of multipliers (ADMM) and combine it with interior-point approaches, yielding a first-order method that we refer to as the ADMM-based interior-point method for cVIs (ACVI). We provide convergence guarantees for ACVI in two general classes of problems: (i) when the operator is ξ-monotone, and (ii) when it is monotone, some constraints are active, and the game is not purely rotational. When, in the latter case, the operator is in addition L-Lipschitz, we match the known lower bounds on the rates for the gap function of O(1/√K) and O(1/K) for the last and average iterate, respectively. To our knowledge, this is the first first-order interior-point method for the general cVI problem with a global convergence guarantee. Empirical analyses demonstrate clear advantages of ACVI over common first-order methods. In particular, (i) cyclical behavior is notably reduced, as our method approaches the solution from the analytic center, and (ii) unlike projection-based methods, which zigzag when near a constraint, ACVI handles constraints efficiently.
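For context, a cVI asks for a point x* in a constraint set C such that ⟨F(x*), x − x*⟩ ≥ 0 for all x in C. The sketch below is a hedged illustration of the projection-based first-order baselines the abstract contrasts ACVI with (it is not the paper's ACVI): the classical extragradient method applied to a purely rotational bilinear game, F(x, y) = (y, −x), over a box constraint. All function names and parameter values here are illustrative choices, not taken from the paper.

```python
import numpy as np

def F(z):
    # Monotone, purely rotational operator of the bilinear game min_x max_y x*y.
    # Plain gradient descent-ascent cycles on this problem; extragradient converges.
    x, y = z
    return np.array([y, -x])

def project(z, lo=-1.0, hi=1.0):
    # Euclidean projection onto the box [lo, hi]^2 (the constraint set C).
    return np.clip(z, lo, hi)

def extragradient(F, z0, eta=0.1, iters=2000):
    # Projection-based extragradient: extrapolate with F at the current point,
    # then update with F evaluated at the extrapolated point.
    z = z0.copy()
    for _ in range(iters):
        z_half = project(z - eta * F(z))   # extrapolation (look-ahead) step
        z = project(z - eta * F(z_half))   # update step using look-ahead gradient
    return z

# The unique VI solution of this game on the box is the origin.
z = extragradient(F, np.array([1.0, 1.0]))
```

Each iteration requires two operator evaluations and two projections; when the solution lies on the boundary of C, these repeated projections are exactly the source of the zigzagging behavior that an interior-point scheme such as ACVI avoids by approaching the solution from the interior.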