Poster in Workshop: OPT 2023: Optimization for Machine Learning
Accelerated Methods for Riemannian Min-Max Optimization Ensuring Bounded Geometric Penalties
David Martinez-Rubio · Christophe Roux · Christopher Criscitiello · Sebastian Pokutta
Abstract:
In this work, we study optimization problems of the form $\min_x \max_y f(x, y)$, where $f(x, y)$ is defined on a product Riemannian manifold $\mathcal{M} \times \mathcal{N}$ and is $\mu_x$-strongly geodesically convex (g-convex) in $x$ and $\mu_y$-strongly g-concave in $y$, for $\mu_x, \mu_y > 0$. We design accelerated methods when additionally $f$ is $L$-smooth and $\mathcal{M}$, $\mathcal{N}$ are Hadamard. To that aim we introduce new g-convex optimization results, of independent interest: we show global linear convergence for metric-projected Riemannian gradient descent and improve existing accelerated methods by reducing geometric constants. Additionally, we complete the analysis of two previous works applying to the Riemannian min-max case by removing an assumption about iterates staying in a pre-specified compact set.
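To illustrate the kind of method the abstract refers to, below is a minimal toy sketch of Riemannian gradient descent on a simple 1-D Hadamard manifold: the positive reals with the metric $g = dx^2/x^2$. The objective, exponential map, and step size here are illustrative assumptions, not the authors' algorithm or setting; the function $f(x) = (\log x - \log a)^2$ is the squared geodesic distance to a point $a$, hence strongly g-convex, so plain Riemannian gradient descent converges linearly, matching the flavor of the convergence results discussed.

```python
import math

def riemannian_gd(a, x0=10.0, step=0.25, iters=50):
    """Toy Riemannian gradient descent on (R_{>0}, g = dx^2/x^2).

    Minimizes f(x) = (log x - log a)^2, the squared geodesic
    distance to `a` under this metric (a hypothetical example
    objective, strongly geodesically convex).
    """
    x = x0
    for _ in range(iters):
        # Euclidean derivative of f at x
        df = 2.0 * (math.log(x) - math.log(a)) / x
        # Riemannian gradient under g = dx^2/x^2 is x^2 * f'(x)
        rgrad = x * x * df
        # Exponential map on this manifold: exp_x(v) = x * exp(v / x)
        x = x * math.exp(-step * rgrad / x)
    return x

print(riemannian_gd(3.0))  # converges toward 3.0
```

In log coordinates $u = \log x$ the update is the linear contraction $u^+ = u - 2\eta(u - \log a)$, which makes the linear convergence rate explicit. On higher-dimensional Hadamard manifolds the same template applies with the manifold's exponential map and Riemannian gradient, plus a metric projection onto a g-convex feasible set.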