Faster Ridge Regression via the Subsampled Randomized Hadamard Transform
Yichao Lu · Paramveer Dhillon · Dean P Foster · Lyle Ungar

Sat Dec 07 07:00 PM -- 11:59 PM (PST) @ Harrah's Special Events Center, 2nd Floor
We propose a fast algorithm for ridge regression when the number of features is much larger than the number of observations ($p \gg n$). The standard way to solve ridge regression in this setting works in the dual space and gives a running time of $O(n^2p)$. Our algorithm (SRHT-DRR) runs in time $O(np\log(n))$ and works by preconditioning the design matrix by a Randomized Walsh-Hadamard Transform with a subsequent subsampling of features. We provide risk bounds for our SRHT-DRR algorithm in the fixed design setting and show experimental results on synthetic and real datasets.
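The preconditioning step the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the authors' code: `fwht`, `srht_features`, and the parameter `p_sub` (the number of features kept after subsampling) are names chosen here for clarity. The sketch applies random column signs, a normalized Walsh-Hadamard rotation, and uniform feature subsampling with rescaling, then solves ridge regression in the dual as the paper's setting ($p \gg n$) suggests.

```python
import numpy as np

def fwht(a):
    """Fast Walsh-Hadamard transform along the last axis.

    The last-axis length must be a power of 2 (pad with zero
    columns otherwise). Runs in O(p log p) per row.
    """
    a = a.copy()
    n = a.shape[-1]
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            x = a[..., i:i + h].copy()
            y = a[..., i + h:i + 2 * h].copy()
            a[..., i:i + h] = x + y
            a[..., i + h:i + 2 * h] = x - y
        h *= 2
    return a

def srht_features(X, p_sub, rng):
    """Precondition X (n x p) and subsample p_sub of its p features.

    Illustrative SRHT sketch: random signs D, orthonormal Hadamard
    rotation H / sqrt(p), then uniform subsampling of columns with
    rescaling so that E[X_s X_s^T] = X X^T.
    """
    n, p = X.shape
    signs = rng.choice([-1.0, 1.0], size=p)          # diagonal sign flips D
    Xh = fwht(X * signs) / np.sqrt(p)                # orthonormal rotation
    idx = rng.choice(p, size=p_sub, replace=False)   # subsample features
    return Xh[:, idx] * np.sqrt(p / p_sub)

def dual_ridge(X, y, lam):
    """Dual-space ridge solve: O(n^2 p) for the Gram matrix, n x n system."""
    n = X.shape[0]
    K = X @ X.T
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    return X.T @ alpha                               # primal weights
```

Because the signed Hadamard rotation is orthogonal, keeping all p features (`p_sub = p`) preserves the Gram matrix $XX^\top$ exactly; subsampling `p_sub < p` features gives the speedup, at the cost of the risk inflation the paper bounds.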

Author Information

Yichao Lu (University of Pennsylvania)
Paramveer Dhillon (University of Pennsylvania)
Dean P Foster (University of Pennsylvania)
Lyle Ungar (University of Pennsylvania)
