Motivated by energy-based analyses for descent methods in the Euclidean setting, we investigate a generalisation of such energy-based analyses to descent methods over Riemannian manifolds. In doing so, we find that it is possible to derive curvature-free guarantees for such descent methods, improving on work by Zhang and Sra [2016]. This analysis allows us to study acceleration of Riemannian gradient descent in the geodesically convex setting, and to improve on an existing result by Alimisis et al. [2021]. Finally, extending the analysis by Ahn and Sra [2020], we provide some sufficient conditions for the acceleration of Riemannian descent methods in the strongly geodesically convex setting.
Author Information
Vishwak Srinivasan (Massachusetts Institute of Technology)
Ashia Wilson (MIT)
More from the Same Authors
- 2022 : Sufficient conditions for non-asymptotic convergence of Riemannian optimization methods »
  Vishwak Srinivasan · Ashia Wilson
- 2022 : Poster Session 2 »
  Jinwuk Seok · Bo Liu · Ryotaro Mitsuboshi · David Martinez-Rubio · Weiqiang Zheng · Ilgee Hong · Chen Fan · Kazusato Oko · Bo Tang · Miao Cheng · Aaron Defazio · Tim G. J. Rudner · Gabriele Farina · Vishwak Srinivasan · Ruichen Jiang · Peng Wang · Jane Lee · Nathan Wycoff · Nikhil Ghosh · Yinbin Han · David Mueller · Liu Yang · Amrutha Varshini Ramesh · Siqi Zhang · Kaifeng Lyu · David Yunis · Kumar Kshitij Patel · Fangshuo Liao · Dmitrii Avdiukhin · Xiang Li · Sattar Vakili · Jiaxin Shi
- 2022 : Contributed Talks 3 »
  Cristóbal Guzmán · Fangshuo Liao · Vishwak Srinivasan · Zhiyuan Li
- 2022 Poster: Algorithms that Approximate Data Removal: New Results and Limitations »
  Vinith Suriyakumar · Ashia Wilson
- 2020 Poster: On Learning Ising Models under Huber's Contamination Model »
  Adarsh Prasad · Vishwak Srinivasan · Sivaraman Balakrishnan · Pradeep Ravikumar