Poster
Greedy and Random Quasi-Newton Methods with Faster Explicit Superlinear Convergence
Dachao Lin · Haishan Ye · Zhihua Zhang

Wed Dec 08 12:30 AM -- 02:00 AM (PST)

In this paper, we follow Rodomanov and Nesterov's work to study quasi-Newton methods. We focus on the common SR1 and BFGS quasi-Newton updates and establish better explicit (local) superlinear convergence rates. First, for the greedy quasi-Newton update, which greedily selects the direction that maximizes a certain measure of progress, we improve the convergence rate to a condition-number-free superlinear rate. Second, for the random quasi-Newton update, which selects the direction randomly from a spherically symmetric distribution, we establish the same condition-number-free superlinear rate. Our analysis covers the approximation of a given Hessian matrix, unconstrained quadratic objectives, and general strongly convex, smooth, and strongly self-concordant functions.
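To make the two update rules concrete, below is a minimal NumPy sketch of the greedy and random quasi-Newton updates in the Hessian-approximation setting the abstract mentions: a fixed symmetric positive definite matrix A is approximated by G, starting from G = L*I with L an upper bound on A's eigenvalues. The function names, the coordinate-wise greedy measure u^T G u / u^T A u, and the synthetic demo are our own illustrative assumptions in the spirit of Rodomanov and Nesterov's formulation; the exact progress measure and constants in the paper may differ.

import numpy as np

def bfgs_update(G, A, u):
    # One BFGS-type update of the approximation G toward A along direction u.
    Gu, Au = G @ u, A @ u
    return G - np.outer(Gu, Gu) / (u @ Gu) + np.outer(Au, Au) / (u @ Au)

def sr1_update(G, A, u):
    # One SR1-type update; skips directions where the denominator degenerates.
    r = (G - A) @ u
    denom = u @ r
    if abs(denom) < 1e-12:
        return G
    return G - np.outer(r, r) / denom

def greedy_direction(G, A):
    # Greedy rule (one candidate progress measure): pick the coordinate
    # vector e_i maximizing e_i^T G e_i / e_i^T A e_i.
    i = int(np.argmax(np.diag(G) / np.diag(A)))
    e = np.zeros(G.shape[0])
    e[i] = 1.0
    return e

def random_direction(n, rng):
    # Random rule: draw u from a spherically symmetric distribution
    # (here, uniform on the unit sphere via a normalized Gaussian).
    u = rng.standard_normal(n)
    return u / np.linalg.norm(u)

# Demo: approximate a fixed SPD matrix A ("the Hessian") from G0 = L*I >= A.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)             # synthetic SPD target
L = np.linalg.eigvalsh(A)[-1]       # largest eigenvalue, so L*I >= A
G = L * np.eye(n)

for k in range(50):
    u = greedy_direction(G, A)      # or: random_direction(n, rng)
    G = bfgs_update(G, A, u)        # or: sr1_update(G, A, u)
    if k % 10 == 0:
        print(k, np.linalg.norm(G - A, "fro"))

Running the loop, the Frobenius error ||G - A|| shrinks rapidly; swapping greedy_direction for random_direction exercises the random variant analyzed in the paper.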

Author Information

Dachao Lin (Peking University)
Haishan Ye (The Chinese University of Hong Kong, Shenzhen)
Zhihua Zhang (Shanghai Jiao Tong University)