Poster

Accelerated First-order Methods for Geodesically Convex Optimization on Riemannian Manifolds

Yuanyuan Liu · Fanhua Shang · James Cheng · Hong Cheng · Licheng Jiao

Pacific Ballroom #166

Keywords: [ Non-Convex Optimization ]


Abstract:

In this paper, we propose an accelerated first-order method for geodesically convex optimization, which generalizes Nesterov's standard accelerated method from Euclidean space to nonlinear Riemannian space. We first derive two equations and obtain two nonlinear operators for geodesically convex optimization, which replace the linear extrapolation step used in Euclidean space. In particular, we analyze the global convergence properties of our accelerated method for geodesically strongly convex problems, and show that it improves the convergence rate from $O((1-\mu/L)^{k})$ to $O((1-\sqrt{\mu/L})^{k})$. Moreover, our method also improves the global convergence rate on geodesically (general) convex problems from $O(1/k)$ to $O(1/k^{2})$. Finally, we give a specific iterative scheme for matrix Karcher mean problems, and validate our theoretical results with experiments.
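For readers unfamiliar with the matrix Karcher mean application mentioned above, the following is a minimal, illustrative sketch of plain (unaccelerated) Riemannian gradient descent on the manifold of symmetric positive-definite (SPD) matrices. The exponential/logarithm maps, the step size `eta`, and the simple update rule are standard textbook choices assumed here for illustration; they are not the paper's nonlinear extrapolation operators or its specific accelerated scheme.

```python
# Minimal sketch: Riemannian gradient descent for the matrix Karcher mean
# of SPD matrices A_1, ..., A_N. Illustrative assumptions only; the paper's
# accelerated method uses additional nonlinear operators not reproduced here.
import numpy as np

def _sym_fun(S, fun):
    """Apply a scalar function to the eigenvalues of a symmetric matrix."""
    w, Q = np.linalg.eigh(S)
    return Q @ np.diag(fun(w)) @ Q.T

def exp_map(X, V):
    """Exponential map on the SPD manifold: X^{1/2} exp(X^{-1/2} V X^{-1/2}) X^{1/2}."""
    Xh  = _sym_fun(X, np.sqrt)
    Xih = _sym_fun(X, lambda w: 1.0 / np.sqrt(w))
    return Xh @ _sym_fun(Xih @ V @ Xih, np.exp) @ Xh

def log_map(X, Y):
    """Logarithm map on the SPD manifold: X^{1/2} log(X^{-1/2} Y X^{-1/2}) X^{1/2}."""
    Xh  = _sym_fun(X, np.sqrt)
    Xih = _sym_fun(X, lambda w: 1.0 / np.sqrt(w))
    return Xh @ _sym_fun(Xih @ Y @ Xih, np.log) @ Xh

def karcher_mean(As, eta=1.0, iters=50):
    """Unaccelerated Riemannian gradient descent for the Karcher mean (illustrative)."""
    X = np.mean(As, axis=0)                      # arithmetic mean as a starting point
    for _ in range(iters):
        # The negative Riemannian gradient of f(X) = (1/2N) sum_i d^2(X, A_i)
        # is (1/N) sum_i Log_X(A_i), so we step along that geodesic direction.
        G = np.mean([log_map(X, A) for A in As], axis=0)
        X = exp_map(X, eta * G)
        X = 0.5 * (X + X.T)                      # re-symmetrize against round-off
    return X

# Example: Karcher mean of a few random SPD matrices.
rng = np.random.default_rng(0)
As = []
for _ in range(5):
    B = rng.standard_normal((4, 4))
    As.append(B @ B.T + 4 * np.eye(4))
X_star = karcher_mean(np.array(As))
print(np.linalg.eigvalsh(X_star))                # eigenvalues remain positive
```

In this sketch, the Euclidean extrapolation of Nesterov's method has no direct analogue; the paper's contribution is precisely to replace that step with nonlinear operators suited to the Riemannian geometry, yielding the improved rates stated in the abstract.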
