Poster
Accelerated Stochastic Greedy Coordinate Descent by Soft Thresholding Projection onto Simplex
Chaobing Song · Shaobo Cui · Yong Jiang · Shu-Tao Xia

Wed Dec 06 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #171
In this paper we study the well-known greedy coordinate descent (GCD) algorithm for solving $\ell_1$-regularized problems and improve GCD with two popular strategies: Nesterov's acceleration and stochastic optimization. First, we propose a new rule for greedy selection based on an $\ell_1$-norm square approximation, which is nontrivial to solve but convex; we then propose an efficient algorithm called "SOft ThreshOlding PrOjection (SOTOPO)" that exactly solves the $\ell_1$-regularized $\ell_1$-norm square approximation problem induced by the new rule. Based on the new rule and the SOTOPO algorithm, Nesterov's acceleration and stochastic optimization are then successfully applied to GCD. The resulting algorithm, called accelerated stochastic greedy coordinate descent (ASGCD), has the optimal convergence rate $O(\sqrt{1/\epsilon})$; meanwhile, it reduces the iteration complexity of greedy selection by up to a factor of the sample size. Both theoretically and empirically, we show that ASGCD performs better on high-dimensional, dense problems with sparse solutions.
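
To make the starting point of the abstract concrete, below is a minimal, generic sketch of classical greedy coordinate descent with soft thresholding for the $\ell_1$-regularized least-squares (lasso) objective. This is not the paper's ASGCD or SOTOPO algorithm; it only illustrates the plain GCD baseline that the new selection rule improves upon, and all names (`greedy_cd_lasso`, `soft_threshold`) and parameter choices are illustrative assumptions.

```python
# Generic greedy coordinate descent with soft thresholding for
#   f(x) = 0.5 * ||A x - b||_2^2 + lam * ||x||_1.
# Illustrative sketch only; not the authors' ASGCD/SOTOPO method.
import numpy as np

def soft_threshold(z, tau):
    """Soft-thresholding operator: proximal map of tau * |.|."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def greedy_cd_lasso(A, b, lam, n_iters=200):
    n, d = A.shape
    x = np.zeros(d)
    col_sq = (A ** 2).sum(axis=0)      # per-coordinate curvature constants
    residual = A @ x - b
    for _ in range(n_iters):
        grad = A.T @ residual          # full gradient (O(nd) per iteration)
        # Coordinate-wise proximal candidates and the model decrease of each.
        z = soft_threshold(x - grad / col_sq, lam / col_sq)
        delta = z - x
        gains = (-grad * delta - 0.5 * col_sq * delta ** 2
                 - lam * (np.abs(z) - np.abs(x)))
        j = int(np.argmax(gains))      # greedy selection: best single coordinate
        residual += A[:, j] * delta[j] # rank-one update of the residual
        x[j] = z[j]
    return x

# Usage on synthetic data with a sparse ground truth (hypothetical example).
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 500))
x_true = np.zeros(500)
x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = greedy_cd_lasso(A, b, lam=0.1)
```

Note that the greedy step above still requires a full gradient evaluation; the abstract's contribution is a different selection rule (solved exactly by SOTOPO) together with acceleration and stochastic gradients, which is what reduces the per-iteration cost of greedy selection.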

Author Information

Chaobing Song (Tsinghua University)
Shaobo Cui (Tsinghua University)
Yong Jiang (Tsinghua-Berkeley Shenzhen Institute)

Yong Jiang is a Professor at the Graduate School at Shenzhen, Tsinghua University. He received his Ph.D. from Tsinghua University in 2002. His research interests range from computer network architecture to Internet applications, with a current focus on Internet architecture and machine learning.

Shu-Tao Xia (Tsinghua University)
