

Poster in Workshop: Order up! The Benefits of Higher-Order Optimization in Machine Learning

Perseus: A Simple and Optimal High-Order Method for Variational Inequalities

Tianyi Lin · Michael Jordan


Abstract: This paper settles an open and challenging question pertaining to the design of simple high-order regularization methods for solving smooth and monotone variational inequalities (VIs). A VI involves finding $x^\star \in \mathcal{X}$ such that $\langle F(x), x - x^\star \rangle \geq 0$ for all $x \in \mathcal{X}$, and we consider the setting where $F: \mathbb{R}^d \rightarrow \mathbb{R}^d$ is smooth with up to $(p-1)^{\textnormal{th}}$-order derivatives. High-order methods based on binary search procedures have been developed and shown to achieve a rate of $O(\epsilon^{-2/(p+1)} \log(1/\epsilon))$ (Bullins, 2020; Lin, 2021; Jiang, 2022). However, such a search procedure can be computationally prohibitive in practice (Nesterov, 2018), and the problem of finding a simple high-order regularization method has remained an open and challenging question in optimization theory. We propose a $p^{\textnormal{th}}$-order method that does not require any binary search procedure and prove that it converges to a weak solution at a global rate of $O(\epsilon^{-2/(p+1)})$. A lower bound of $\Omega(\epsilon^{-2/(p+1)})$ is also established under a linear span assumption, showing that our $p^{\textnormal{th}}$-order method is optimal in the monotone setting. A version with restarting attains global linear and local superlinear convergence rates for smooth and strongly monotone VIs. Our method also achieves a global rate of $O(\epsilon^{-2/p})$ for solving smooth and non-monotone VIs satisfying the Minty condition, and the restarted version again attains global linear and local superlinear convergence rates if the strong Minty condition holds.
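For context, the "weak solution" targeted by the method and the "Minty condition" refer to the standard Minty formulation of a VI. The following textbook definitions (not specific to this paper) contrast it with the strong formulation:

$$\textnormal{strong (Stampacchia) solution:} \quad \langle F(x^\star),\, x - x^\star \rangle \geq 0 \quad \forall\, x \in \mathcal{X},$$

$$\textnormal{weak (Minty) solution:} \quad \langle F(x),\, x - x^\star \rangle \geq 0 \quad \forall\, x \in \mathcal{X}.$$

For continuous monotone $F$, the two solution sets coincide; in the non-monotone setting, the Minty condition asserts that a weak solution exists.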

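To make the VI template concrete, below is a minimal sketch of the classical first-order extragradient method on a bilinear saddle-point instance, whose optimality conditions form a smooth monotone VI. This is a standard baseline, not the $p^{\textnormal{th}}$-order Perseus method from the paper; the instance, step size, and iteration budget are illustrative assumptions.

import numpy as np

# Monotone VI from the bilinear saddle point min_x max_y x^T A y:
# the operator F(z) = (A y, -A^T x) is monotone and Lipschitz with
# constant L = ||A||_2, and the unique solution is z* = 0.
rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))

def F(z):
    x, y = z[:d], z[d:]
    return np.concatenate([A @ y, -A.T @ x])

L = np.linalg.norm(A, 2)   # spectral norm = Lipschitz constant of F
eta = 0.5 / L              # extragradient step size below 1/L
z = rng.standard_normal(2 * d)

for _ in range(2000):
    z_half = z - eta * F(z)       # extrapolation (look-ahead) step
    z = z - eta * F(z_half)       # update at the extrapolated point

print(np.linalg.norm(F(z)))       # operator residual; tends to 0 at the solution

The look-ahead evaluation of $F$ is what distinguishes extragradient from plain forward iteration, which diverges on bilinear problems; higher-order methods such as the one in this paper replace this first-order step with a regularized model built from higher derivatives of $F$.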