Poster
Sparse Random Feature Algorithm as Coordinate Descent in Hilbert Space
Ian En-Hsu Yen · Ting-Wei Lin · Shou-De Lin · Pradeep Ravikumar · Inderjit Dhillon

Wed Dec 10 04:00 PM -- 08:59 PM (PST) @ Level 2, room 210D
In this paper, we propose a Sparse Random Feature algorithm, which learns a sparse non-linear predictor by minimizing an $\ell_1$-regularized objective function over the Hilbert space induced by the kernel function. By interpreting the algorithm as Randomized Coordinate Descent in the infinite-dimensional space, we show that the proposed approach converges to a solution within $\epsilon$-precision of the exact kernel method by drawing $O(1/\epsilon)$ random features, in contrast to the $O(1/\epsilon^2)$-type convergence achieved by Monte-Carlo analysis in the current Random Feature literature. In our experiments, the Sparse Random Feature algorithm obtains a sparse solution that requires less memory and prediction time while maintaining comparable performance on regression and classification tasks. Meanwhile, as an approximate solver for the infinite-dimensional $\ell_1$-regularized problem, the randomized approach converges to a better solution than the Boosting approach when the greedy step of Boosting cannot be performed exactly.
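The sketch below illustrates the general idea described in the abstract (not the authors' exact algorithm): random Fourier features for an RBF kernel are drawn in batches, an $\ell_1$-regularized fit is performed over the features drawn so far, and features whose weights are driven to zero are discarded, yielding a sparse predictor. All names, the batch schedule, and the use of scikit-learn's Lasso are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

def sparse_random_feature_fit(X, y, gamma=1.0, n_rounds=10, batch=50, lam=0.01, seed=0):
    """Illustrative sketch only: batched random Fourier features + l1-regularized
    refitting, keeping features with nonzero weight. Names and schedule are assumptions,
    not the paper's exact procedure."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = np.empty((0, d))   # random feature directions kept so far
    b = np.empty(0)        # random phases kept so far
    for _ in range(n_rounds):
        # draw a new batch of random Fourier features for the RBF kernel exp(-gamma * ||x - x'||^2)
        W_new = rng.normal(scale=np.sqrt(2 * gamma), size=(batch, d))
        b_new = rng.uniform(0.0, 2 * np.pi, size=batch)
        W = np.vstack([W, W_new])
        b = np.concatenate([b, b_new])
        # explicit feature map for the features drawn so far
        Z = np.sqrt(2.0 / len(b)) * np.cos(X @ W.T + b)
        # l1-regularized fit over the current (finite) set of random coordinates
        model = Lasso(alpha=lam, max_iter=5000).fit(Z, y)
        keep = np.abs(model.coef_) > 1e-8   # discard features with zero weight
        W, b = W[keep], b[keep]
    # final refit on the retained sparse feature set
    Z = np.sqrt(2.0 / max(len(b), 1)) * np.cos(X @ W.T + b)
    model = Lasso(alpha=lam, max_iter=5000).fit(Z, y)
    return W, b, model
```

At prediction time only the retained features (W, b) and their weights are needed, which is the source of the memory and prediction-time savings the abstract refers to.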

Author Information

Ian En-Hsu Yen (University of Texas at Austin)
Ting-Wei Lin (National Taiwan University)
Shou-De Lin (National Taiwan University)
Pradeep Ravikumar (Carnegie Mellon University)
Inderjit Dhillon (Google & UT Austin)
