
Workshop: OPT 2023: Optimization for Machine Learning

Almost multisecant BFGS quasi-Newton method

Mokhwa Lee · Yifan Sun


Quasi-Newton (QN) methods provide an alternative to second-order techniques for solving minimization problems by approximating curvature. This approach reduces computational complexity because it relies solely on first-order information while satisfying the secant condition. This paper focuses on multi-secant (MS) extensions of QN methods, which enhance the Hessian approximation at low cost. Specifically, we use a low-rank perturbation strategy to construct an almost-secant QN method that maintains positive definiteness of the Hessian estimate, which in turn helps ensure consistent descent (and reduces the risk of divergence). Our results show that careful tuning of the updates greatly improves the stability and effectiveness of multisecant updates.
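To make the idea concrete, here is a minimal sketch of a multisecant inverse-Hessian update with a positive-definiteness correction. This is an illustrative assumption, not the paper's exact algorithm: it uses a Broyden-style least-change update to enforce the multisecant condition H Y ≈ S, then symmetrizes and clips eigenvalues (one simple low-rank-style correction) to keep the estimate positive definite. The function name `multisecant_update` and the clipping threshold `eps` are hypothetical.

```python
import numpy as np

def multisecant_update(H0, S, Y, eps=1e-8):
    """Sketch of a multisecant inverse-Hessian update (illustrative only).

    S, Y are n-by-k matrices whose columns are step and gradient-difference
    vectors; the multisecant condition asks for H @ Y = S.
    """
    # Least-change update that satisfies H @ Y = S exactly:
    #   H = H0 + (S - H0 Y) (Y^T Y)^{-1} Y^T
    H = H0 + (S - H0 @ Y) @ np.linalg.solve(Y.T @ Y, Y.T)
    # The multisecant update is not symmetric in general, so symmetrize;
    # this relaxes the secant condition to hold only approximately.
    H = 0.5 * (H + H.T)
    # Restore positive definiteness by clipping small/negative eigenvalues,
    # which guarantees the resulting direction -H @ grad is a descent direction.
    w, V = np.linalg.eigh(H)
    w = np.maximum(w, eps)
    return V @ np.diag(w) @ V.T
```

After symmetrization and clipping the secant equations hold only approximately ("almost multisecant"), which is exactly the trade-off the abstract describes: a small perturbation of the secant conditions in exchange for a positive definite curvature estimate.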
