Cubic Regularized Quasi-Newton Methods
Klea Ziu
2022 Spotlight Talk
in
Workshop: Order up! The Benefits of Higher-Order Optimization in Machine Learning
Abstract
In this paper, we propose a Cubic Regularized L-BFGS method. Cubic Regularized Newton outperforms the classical Newton method in terms of global convergence behavior. Classically, the L-BFGS approximation is applied to the Newton method. We propose a new variant of inexact Cubic Regularized Newton and use the L-BFGS approximation as an inexact Hessian within it. This allows us to obtain better theoretical convergence rates and good practical performance, especially from starting points where the classical Newton method diverges.
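For context, a minimal sketch of the kind of update the abstract describes: the standard cubic-regularized Newton subproblem with the exact Hessian replaced by a quasi-Newton approximation. The symbols $B_k$ and $M$ here are illustrative assumptions, not notation taken from the talk:

$$x_{k+1} = \arg\min_{y} \; \langle \nabla f(x_k),\, y - x_k \rangle + \tfrac{1}{2} \langle B_k (y - x_k),\, y - x_k \rangle + \tfrac{M}{6} \| y - x_k \|^3,$$

where $B_k \approx \nabla^2 f(x_k)$ would be the L-BFGS approximation of the Hessian and $M > 0$ is the cubic regularization parameter.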