Poster
Is Input Sparsity Time Possible for Kernel Low-Rank Approximation?
Cameron Musco · David Woodruff
Low-rank approximation is a common tool used to accelerate kernel methods: the $n \times n$ kernel matrix $K$ is approximated via a rank-$k$ matrix $\tilde K$, which can be stored in much less space and processed more quickly. In this work we study the limits of computationally efficient low-rank kernel approximation. We show that for a broad class of kernels, including the popular Gaussian and polynomial kernels, computing a relative-error rank-$k$ approximation to $K$ is at least as difficult as multiplying the input data matrix $A \in \mathbb{R}^{n \times d}$ by an arbitrary matrix $C \in \mathbb{R}^{d \times k}$. Barring a breakthrough in fast matrix multiplication, when $k$ is not too large, this requires $\Omega(\mathrm{nnz}(A)\,k)$ time, where $\mathrm{nnz}(A)$ is the number of non-zeros in $A$. In many parameter regimes this lower bound matches recent work on subquadratic-time algorithms for low-rank approximation of general kernels [MM16, MW17], demonstrating that these algorithms are unlikely to be significantly improved, in particular to $O(\mathrm{nnz}(A))$ input-sparsity runtime. At the same time there is hope: we show for the first time that $O(\mathrm{nnz}(A))$-time approximation is possible for general radial basis function kernels (e.g., the Gaussian kernel) for the closely related problem of low-rank approximation of the kernelized dataset.
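For orientation, the sketch below illustrates the kind of rank-$k$ kernel approximation the abstract refers to: a plain uniform-sampling Nyström approximation of a Gaussian kernel matrix in NumPy. This is not the algorithm from this paper or from [MM16, MW17]; the function names, the landmark-sampling scheme, and all parameters are illustrative assumptions only.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian kernel exp(-||x - y||^2 / (2 sigma^2)),
    # computed via ||x - y||^2 = ||x||^2 + ||y||^2 - 2 <x, y>.
    sq = (X ** 2).sum(axis=1)[:, None] + (Y ** 2).sum(axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-np.maximum(sq, 0.0) / (2.0 * sigma ** 2))

def nystrom_rank_k(X, k, sigma=1.0, seed=0):
    # Rank-k Nystrom approximation K_tilde = C W^+ C^T, where C = K[:, S] and
    # W = K[S, S] for k uniformly sampled landmark indices S.  The factors take
    # O(nk) space, so K_tilde is never formed explicitly.
    rng = np.random.default_rng(seed)
    idx = rng.choice(X.shape[0], size=k, replace=False)
    C = gaussian_kernel(X, X[idx], sigma)   # n x k
    W_pinv = np.linalg.pinv(C[idx, :])      # pseudoinverse of the k x k block
    return C, W_pinv

# Example: approximate a 1000 x 1000 Gaussian kernel matrix with rank 20 and
# apply the approximation to a vector in O(nk) time.
X = np.random.default_rng(1).standard_normal((1000, 5))
C, W_pinv = nystrom_rank_k(X, k=20)
v = np.ones(X.shape[0])
K_tilde_v = C @ (W_pinv @ (C.T @ v))
```

Generic schemes of this form already run in subquadratic time; the paper's results concern how far such running times can be pushed, e.g., whether $O(\mathrm{nnz}(A))$ input-sparsity time is achievable.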
Author Information
Cameron Musco (Massachusetts Institute of Technology)
David Woodruff (Carnegie Mellon University)
More from the Same Authors
- 2021 Spotlight: Optimal Sketching for Trace Estimation
  Shuli Jiang · Hai Pham · David Woodruff · Richard Zhang
- 2021 Poster: Linear and Kernel Classification in the Streaming Model: Improved Bounds for Heavy Hitters
  Arvind Mahankali · David Woodruff
- 2021 Poster: Optimal Sketching for Trace Estimation
  Shuli Jiang · Hai Pham · David Woodruff · Richard Zhang
- 2021 Poster: Few-Shot Data-Driven Algorithms for Low Rank Approximation
  Piotr Indyk · Tal Wagner · David Woodruff
- 2020 Poster: Revisiting the Sample Complexity of Sparse Spectrum Approximation of Gaussian Processes
  Minh Hoang · Nghia Hoang · Hai Pham · David Woodruff
- 2018 Poster: Inferring Networks From Random Walk-Based Node Similarities
  Jeremy Hoskins · Cameron Musco · Christopher Musco · Charalampos Tsourakakis
- 2017 Poster: Approximation Algorithms for $\ell_0$-Low Rank Approximation
  Karl Bringmann · Pavel Kolev · David Woodruff
- 2017 Poster: Near Optimal Sketching of Low-Rank Tensor Regression
  Xingguo Li · Jarvis Haupt · David Woodruff
- 2017 Poster: Recursive Sampling for the Nyström Method
  Cameron Musco · Christopher Musco
- 2015 Poster: Randomized Block Krylov Methods for Stronger and Faster Approximate Singular Value Decomposition
  Cameron Musco · Christopher Musco
- 2015 Oral: Randomized Block Krylov Methods for Stronger and Faster Approximate Singular Value Decomposition
  Cameron Musco · Christopher Musco