In this paper, we propose an accelerated first-order method for geodesically convex optimization, which generalizes the standard Nesterov's accelerated method from Euclidean space to nonlinear Riemannian spaces. We first derive two equations and obtain two nonlinear operators for geodesically convex optimization, which replace the linear extrapolation step used in Euclidean space. In particular, we analyze the global convergence of our accelerated method for geodesically strongly convex problems and show that it improves the convergence rate from O((1-\mu/L)^{k}) to O((1-\sqrt{\mu/L})^{k}). Moreover, our method also improves the global convergence rate for geodesically general convex problems from O(1/k) to O(1/k^{2}). Finally, we give a specific iterative scheme for matrix Karcher mean problems and validate our theoretical results with experiments.
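For context, below is a minimal sketch of the Euclidean baseline the paper generalizes: standard Nesterov acceleration for an L-smooth, mu-strongly convex function, whose rate O((1-\sqrt{\mu/L})^{k}) is the one the paper matches on manifolds. The function and parameter names here are illustrative assumptions, not the paper's algorithm; the paper's contribution is to replace the linear extrapolation line with two nonlinear Riemannian operators.

```python
import numpy as np

def nesterov_agd(grad, x0, L, mu, iters=100):
    """Nesterov accelerated gradient descent (Euclidean baseline).

    grad: gradient oracle of an L-smooth, mu-strongly convex function.
    Uses the constant momentum coefficient for the strongly convex case.
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
    for _ in range(iters):
        x_next = y - grad(y) / L            # gradient step at the extrapolated point
        y = x_next + beta * (x_next - x)    # linear extrapolation (momentum) step;
                                            # on a manifold this is replaced by
                                            # nonlinear operators (e.g. built from
                                            # exponential/logarithm maps)
        x = x_next
    return x

# Illustrative example: quadratic f(x) = 0.5 * x^T A x, minimizer at the origin,
# with L = 10 and mu = 1 (condition number 10).
A = np.diag([1.0, 10.0])
x_star = nesterov_agd(lambda v: A @ v, np.array([5.0, 5.0]), L=10.0, mu=1.0)
```

On this quadratic, the iterates contract toward the origin at roughly the accelerated rate (1 - \sqrt{\mu/L}) per step, versus (1 - \mu/L) for plain gradient descent.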
Author Information
Yuanyuan Liu (The Chinese University of Hong Kong)
Fanhua Shang (The Chinese University of Hong Kong)
Fanhua Shang is currently a Research Associate with the Department of Computer Science and Engineering, The Chinese University of Hong Kong. From 2013 to 2015, he was a Post-Doctoral Research Fellow with the Department of Computer Science and Engineering, The Chinese University of Hong Kong. From 2012 to 2013, he was a Post-Doctoral Research Associate with the Department of Electrical and Computer Engineering, Duke University, Durham, NC, USA. He received the Ph.D. degree in Circuits and Systems from Xidian University, Xi'an, China, in 2012. His current research interests include machine learning, data mining, artificial intelligence, and computer vision.
James Cheng (The Chinese University of Hong Kong)
Hong Cheng (The Chinese University of Hong Kong)
Licheng Jiao (Xidian University)
More from the Same Authors
- 2021 Poster: Deconvolutional Networks on Graph Data
  Jia Li · Jiajin Li · Yang Liu · Jianwei Yu · Yueting Li · Hong Cheng
- 2020 Poster: Dirichlet Graph Variational Autoencoder
  Jia Li · Jianwei Yu · Jiajin Li · Honglei Zhang · Kangfei Zhao · Yu Rong · Hong Cheng · Junzhou Huang
- 2020 Poster: Boosting First-Order Methods by Shifting Objective: New Schemes with Faster Worst-Case Rates
  Kaiwen Zhou · Anthony Man-Cho So · James Cheng
- 2018 Poster: Norm-Ranging LSH for Maximum Inner Product Search
  Xiao Yan · Jinfeng Li · Xinyan Dai · Hongzhi Chen · James Cheng
- 2014 Poster: Generalized Higher-Order Orthogonal Iteration for Tensor Decomposition and Completion
  Yuanyuan Liu · Fanhua Shang · Wei Fan · James Cheng · Hong Cheng