Spotlight
Random Features for Large-Scale Kernel Machines
Ali Rahimi · Benjamin Recht
To accelerate the training of kernel machines, we propose to map the input data to a randomized low-dimensional feature space and then apply existing fast linear methods. The features are designed so that the inner products of the transformed data approximate those in the feature space of a user-specified shift-invariant kernel. We explore two sets of random features, provide convergence bounds on their ability to approximate various radial basis kernels, and show that in large-scale classification and regression tasks, linear machine learning algorithms applied to these features outperform state-of-the-art large-scale kernel machines.
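The abstract's idea for the Gaussian RBF kernel can be sketched with random Fourier features: sample frequencies from the kernel's Fourier transform, project the data, and take cosines, so that inner products of the resulting features approximate the kernel. The following is a minimal illustration, assuming NumPy; the function name, the choice of `D`, and `gamma` are illustrative, not from the paper.

```python
import numpy as np

def random_fourier_features(X, D=500, gamma=1.0, rng=None):
    """Map X (n, d) to D random features whose pairwise inner
    products approximate the RBF kernel exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # Frequencies drawn from the Fourier transform of the RBF kernel,
    # which is a Gaussian with variance 2 * gamma per dimension.
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))
    b = rng.uniform(0, 2 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Compare the randomized approximation against the exact RBF kernel.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = random_fourier_features(X, D=2000, gamma=0.5, rng=1)
K_approx = Z @ Z.T
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq)
err = np.abs(K_approx - K_exact).max()
```

With the features in hand, a plain linear classifier or ridge regression on `Z` stands in for a full kernel machine; the approximation error shrinks at rate O(1/sqrt(D)) as the number of features grows.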
Author Information
Ali Rahimi (Intel)
Benjamin Recht (California Institute of Technology)
Related Events (a corresponding poster, oral, or spotlight)
- 2007 Poster: Random Features for Large-Scale Kernel Machines (Tue. Dec 4th, 06:30 -- 06:40 PM)
More from the Same Authors
- 2010 Poster: Random Conic Pursuit for Semidefinite Programming (Ariel Kleiner · Ali Rahimi · Michael Jordan)
- 2008 Poster: Weighted Sums of Random Kitchen Sinks: Replacing minimization with randomization in learning (Ali Rahimi · Benjamin Recht)
- 2008 Spotlight: Weighted Sums of Random Kitchen Sinks: Replacing minimization with randomization in learning (Ali Rahimi · Benjamin Recht)
- 2006 Poster: Estimating Observation Functions in Dynamical Systems Using Unsupervised Regression (Ali Rahimi · Benjamin Recht)