
Less is More: Nyström Computational Regularization
Alessandro Rudi · Raffaello Camoriano · Lorenzo Rosasco

Thu Dec 10 08:00 AM -- 12:00 PM (PST) @ 210 C #63

We study Nyström-type subsampling approaches to large-scale kernel methods, and prove learning bounds in the statistical learning setting, where random sampling and high-probability estimates are considered. In particular, we prove that these approaches can achieve optimal learning bounds, provided the subsampling level is suitably chosen. These results suggest a simple incremental variant of Nyström kernel ridge regression, where the subsampling level controls regularization and computation at the same time. Extensive experimental analysis shows that the considered approach achieves state-of-the-art performance on benchmark large-scale datasets.
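To make the idea concrete, here is a minimal sketch of Nyström kernel ridge regression, the method the abstract builds on: m landmark points are subsampled from the training set, and the regression coefficients are obtained from an m-dimensional linear system instead of the full n-dimensional one. This is an illustrative implementation under standard assumptions (uniform subsampling, Gaussian kernel, plain Tikhonov regularization), not the paper's exact incremental algorithm; all function names and parameters below are our own.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of A and B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def nystrom_krr_fit(X, y, m, lam, sigma=1.0, seed=None):
    # Subsample m landmark points uniformly at random; m controls both
    # the regularization and the computational cost (O(n m^2) vs O(n^3)).
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    Xm = X[idx]
    Knm = gaussian_kernel(X, Xm, sigma)   # n x m cross-kernel
    Kmm = gaussian_kernel(Xm, Xm, sigma)  # m x m landmark kernel
    n = len(X)
    # Solve the m-dimensional system (Knm^T Knm + n*lam*Kmm) alpha = Knm^T y;
    # a tiny jitter keeps the system well conditioned.
    A = Knm.T @ Knm + n * lam * Kmm + 1e-10 * np.eye(m)
    alpha = np.linalg.solve(A, Knm.T @ y)
    return Xm, alpha

def nystrom_krr_predict(Xtest, Xm, alpha, sigma=1.0):
    # Prediction only needs kernels against the m landmarks.
    return gaussian_kernel(Xtest, Xm, sigma) @ alpha

# Example: fit a noisy sine with 50 landmarks out of 500 points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
Xm, alpha = nystrom_krr_fit(X, y, m=50, lam=1e-6, sigma=0.5, seed=0)
Xt = np.linspace(-2.5, 2.5, 100)[:, None]
pred = nystrom_krr_predict(Xt, Xm, alpha, sigma=0.5)
mse = np.mean((pred - np.sin(Xt[:, 0]))**2)
```

Increasing m enlarges the hypothesis space while raising the cost, which is the sense in which the subsampling level acts as a computational regularizer: one can grow m incrementally and stop when validation error plateaus.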

Author Information

Alessandro Rudi (University of Genova)
Raffaello Camoriano (IIT - UNIGE)

Machine Learning and Robotics Postdoctoral Researcher with a strong Computer Science and Engineering background, focusing on scalable algorithms for predictive modeling, incremental lifelong learning and applications in robotics, visual recognition and dynamics learning.

Lorenzo Rosasco (University of Genova)
