
On Quadratic Convergence of DC Proximal Newton Algorithm in Nonconvex Sparse Learning
Xingguo Li · Lin Yang · Jason Ge · Jarvis Haupt · Tong Zhang · Tuo Zhao

Mon Dec 04 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #54

We propose a DC proximal Newton algorithm for solving nonconvex regularized sparse learning problems in high dimensions. Our proposed algorithm integrates the proximal Newton algorithm with multi-stage convex relaxation based on difference-of-convex (DC) programming, and enjoys both strong computational and statistical guarantees. Specifically, by leveraging a sophisticated characterization of sparse modeling structures (i.e., local restricted strong convexity and Hessian smoothness), we prove that within each stage of convex relaxation, our proposed algorithm achieves (local) quadratic convergence, and eventually obtains a sparse approximate local optimum with optimal statistical properties after only a few convex relaxations. Numerical experiments are provided to support our theory.
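The multi-stage DC scheme described above can be sketched in a few lines: at each stage, the nonconvex penalty is relaxed to a reweighted L1 norm, and the resulting convex subproblem is solved by a second-order (Newton-type) inner solver. The sketch below is an illustrative simplification, not the authors' implementation: it uses a capped-L1 penalty for the DC reweighting and, since the loss here is least squares (quadratic), a coordinate-descent weighted lasso solver, for which each sweep coincides with an exact per-coordinate proximal Newton step. The function names `weighted_lasso_cd` and `dc_multistage` and all parameter choices are hypothetical.

```python
import numpy as np

def weighted_lasso_cd(X, y, lam_w, n_sweeps=200):
    """Solve min_b (1/2n)||y - Xb||^2 + sum_j lam_w[j] |b_j| by coordinate
    descent. For a quadratic loss, each coordinate update is an exact
    proximal Newton step (illustrative stand-in for the paper's solver)."""
    n, d = X.shape
    b = np.zeros(d)
    r = y.copy()                          # residual y - X @ b
    col_sq = (X ** 2).sum(axis=0) / n     # per-coordinate curvature
    for _ in range(n_sweeps):
        for j in range(d):
            rho = X[:, j] @ r / n + col_sq[j] * b[j]
            bj_new = np.sign(rho) * max(abs(rho) - lam_w[j], 0.0) / col_sq[j]
            r += X[:, j] * (b[j] - bj_new)
            b[j] = bj_new
    return b

def dc_multistage(X, y, lam=0.1, beta=0.5, n_stages=5):
    """Multi-stage convex relaxation with a capped-L1 (DC) penalty
    p(t) = lam * min(|t|, beta): each stage reweights the L1 term using
    the previous iterate, then solves the convex subproblem."""
    d = X.shape[1]
    w = np.full(d, lam)                   # stage 1 reduces to the plain lasso
    b = np.zeros(d)
    for _ in range(n_stages):
        b = weighted_lasso_cd(X, y, w)
        # capped-L1 reweighting: coefficients above beta are no longer penalized,
        # which removes the lasso's shrinkage bias on large entries
        w = lam * (np.abs(b) < beta)
    return b
```

As in the theory, the first stage is an ordinary convex (lasso) relaxation, and each subsequent stage refines the solution by dropping the penalty on coefficients that are already estimated to be large, so only a few stages are typically needed.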

Author Information

Xingguo Li (Princeton University)
Lin Yang (Johns Hopkins University)
Jason Ge (Princeton University)
Jarvis Haupt (University of Minnesota)
Tong Zhang (Tencent AI Lab)
Tuo Zhao (Georgia Tech)
