Poster
Scaling Neural Tangent Kernels via Sketching and Random Features
Amir Zandieh · Insu Han · Haim Avron · Neta Shoham · Chaewon Kim · Jinwoo Shin

Wed Dec 08 04:30 PM -- 06:00 PM (PST)

The Neural Tangent Kernel (NTK) characterizes the behavior of infinitely wide neural networks trained under least-squares loss by gradient descent. Recent works also report that NTK regression can outperform finitely wide neural networks trained on small-scale datasets. However, the computational complexity of kernel methods has limited their use in large-scale learning tasks. To accelerate learning with the NTK, we design a near input-sparsity time approximation algorithm for the NTK by sketching the polynomial expansions of arc-cosine kernels: our sketch for the convolutional counterpart of the NTK (CNTK) can transform any image in time linear in the number of pixels. Furthermore, we prove a spectral approximation guarantee for the NTK matrix by combining random features (based on leverage score sampling) of the arc-cosine kernels with a sketching algorithm. We benchmark our methods on various large-scale regression and classification tasks and show that a linear regressor trained on our CNTK features matches the accuracy of the exact CNTK on the CIFAR-10 dataset while achieving a 150x speedup.
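To make the random-features ingredient concrete, the following is a minimal sketch of the classical Cho-Saul random features for the order-0 and order-1 arc-cosine kernels, combined to approximate the NTK of a two-layer ReLU network. This is not the paper's algorithm: the function name `ntk_random_features` and the feature count `m` are illustrative, and the paper additionally uses leverage score sampling and sketching to keep the tensor-product term low-dimensional.

```python
import numpy as np

def ntk_random_features(X, m, seed=0):
    """Approximate feature map for the two-layer ReLU NTK.

    Uses Gaussian random features for the arc-cosine kernels:
      k1(x, y) ~ phi1(x) . phi1(y)   (order 1, ReLU features)
      k0(x, y) ~ phi0(x) . phi0(y)   (order 0, step features)
    and the identity NTK(x, y) = k1(x, y) + <x, y> * k0(x, y)
    for a two-layer ReLU network (up to normalization).
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((d, m))                 # Gaussian projections
    Z = X @ W                                       # (n, m)
    phi1 = np.sqrt(2.0 / m) * np.maximum(Z, 0.0)    # order-1 arc-cosine features
    phi0 = np.sqrt(2.0 / m) * (Z > 0.0)             # order-0 arc-cosine features
    # The <x, y> * k0(x, y) term is the inner product of the tensor
    # products x (x) phi0(x); the paper sketches this object instead
    # of materializing it, which we do here only for illustration.
    phi_prod = np.einsum('ni,nj->nij', X, phi0).reshape(n, -1)
    return np.hstack([phi1, phi_prod])              # (n, m + d*m)

# Usage: kernel regression with the NTK reduces to linear regression
# on the explicit features.
X = np.random.randn(100, 10)
F = ntk_random_features(X, m=256)
K_approx = F @ F.T   # approximates the two-layer ReLU NTK Gram matrix
```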

Author Information

Amir Zandieh (EPFL)
Insu Han (Yale University)
Haim Avron (Tel Aviv University)
Neta Shoham (Edgify)
Chaewon Kim (KAIST)
Jinwoo Shin (KAIST)