

Poster in Affinity Workshop: WiML Workshop 1

Accelerating Symmetric Rank 1 Quasi-Newton Method with Nesterov's Gradient

Indrapriyadarsini Sendilkkumaar · Hiroshi Ninomiya · Takeshi Kamio · Hideki Asai


Abstract:

Second-order methods have been shown to achieve better convergence than first-order methods on several highly non-linear problems. However, their computational cost has been a major drawback, and quasi-Newton methods are therefore widely used instead. Among the quasi-Newton methods, the BFGS method is the most commonly used for training neural networks. Recently, the Nesterov's Accelerated Quasi-Newton (NAQ) method was proposed to accelerate BFGS using Nesterov's accelerated gradient and momentum terms. In this study, we explore whether Nesterov's acceleration can be applied to other quasi-Newton methods as well. To this end, this paper proposes a Nesterov's accelerated LSR1 (L-SR1-N) method and a momentum accelerated LSR1 (L-MoSR1) method for training neural networks.
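To make the idea concrete, the sketch below shows one iteration of a symmetric rank-1 (SR1) quasi-Newton update combined with a Nesterov-style look-ahead gradient, in the spirit of NAQ. It is a minimal illustration, not the paper's exact L-SR1-N algorithm: the function names, step size `alpha`, momentum `mu`, and the full-matrix (non-limited-memory) inverse-Hessian approximation `H` are simplifying assumptions for clarity.

```python
import numpy as np

def nesterov_sr1_step(w, v, H, grad_fn, mu=0.9, alpha=0.1, eps=1e-8):
    """One illustrative Nesterov-accelerated SR1 step (sketch, not L-SR1-N).

    w       -- current parameter vector
    v       -- momentum (previous update direction)
    H       -- current inverse-Hessian approximation
    grad_fn -- callable returning the gradient at a given point
    """
    w_ahead = w + mu * v               # Nesterov look-ahead point
    g_ahead = grad_fn(w_ahead)         # gradient evaluated at the look-ahead point
    d = -H @ g_ahead                   # quasi-Newton search direction
    v_new = mu * v + alpha * d         # momentum term plus scaled Newton-like step
    w_new = w + v_new                  # parameter update

    # Curvature pair measured from the look-ahead point, as in NAQ-style methods
    s = w_new - w_ahead
    y = grad_fn(w_new) - g_ahead

    # SR1 update of the inverse-Hessian approximation, with the standard
    # safeguard that skips the update when the denominator is near zero
    r = s - H @ y
    denom = r @ y
    if abs(denom) > eps * np.linalg.norm(r) * np.linalg.norm(y):
        H = H + np.outer(r, r) / denom

    return w_new, v_new, H
```

The L-SR1-N and L-MoSR1 methods proposed in the paper would additionally use a limited-memory (two-loop or compact) representation of the SR1 matrix rather than the dense `H` shown here.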
