Poster
Tight Dimensionality Reduction for Sketching Low Degree Polynomial Kernels
Michela Meister · Tamas Sarlos · David Woodruff
East Exhibition Hall B, C #10
Keywords: [ Algorithms ] [ Kernel Methods ]
Abstract:
We revisit the classic randomized sketch of a tensor product of $q$ vectors $x_i \in \mathbb{R}^n$. The $i$-th coordinate $(Sx)_i$ of the sketch is equal to $\prod_{j=1}^{q} \langle u^{i,j}, x_j \rangle / \sqrt{m}$, where the $u^{i,j}$ are independent random sign vectors. Kar and Karnick (JMLR, 2012) show that if the sketching dimension $m = \Omega(\epsilon^{-2} C_\Omega^2 \log(1/\delta))$, where $C_\Omega$ is a certain property of the point set $\Omega$ one wants to sketch, then with probability $1-\delta$, $\|Sx\|_2 = (1 \pm \epsilon)\|x\|_2$ for all $x \in \Omega$. However, in their analysis $C_\Omega^2$ can be as large as $\Theta(n^{2q})$, even for a set $\Omega$ of $O(1)$ vectors $x$.
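For concreteness, a minimal NumPy rendering of this construction (function and variable names are ours; the sign vectors are drawn on the fly, which suffices for a single norm estimate):

```python
import numpy as np

def tensor_sketch(xs, m, seed=None):
    """Sketch the tensor product of the vectors in xs down to m dimensions.

    Coordinate i of the output is prod_j <u^{i,j}, x_j> / sqrt(m), where
    each u^{i,j} is an independent vector of uniform +/-1 signs.
    """
    rng = np.random.default_rng(seed)
    s = np.full(m, 1.0 / np.sqrt(m))
    for x in xs:
        u = rng.choice([-1.0, 1.0], size=(m, x.shape[0]))  # m sign vectors for this factor
        s *= u @ x  # multiply in the inner products <u^{i,j}, x_j>
    return s

# The norm of the sketch estimates ||x_1 ⊗ ... ⊗ x_q|| = prod_j ||x_j||;
# e.g. for q = 2:
rng = np.random.default_rng(0)
x, y = rng.standard_normal(50), rng.standard_normal(50)
s = tensor_sketch([x, y], m=20000, seed=1)
print(np.linalg.norm(s), np.linalg.norm(x) * np.linalg.norm(y))  # close
```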
We give a new analysis of this sketch, providing nearly optimal bounds. Namely, we show an upper bound of $m = \Theta(\epsilon^{-2} \log(n/\delta) + \epsilon^{-1} \log^q(n/\delta))$, which, by composing with CountSketch, can be improved to $m = \Theta(\epsilon^{-2} \log(1/(\epsilon\delta)) + \epsilon^{-1} \log^q(1/(\epsilon\delta)))$. For the important case of $q = 2$ and $\delta = 1/\mathrm{poly}(n)$, this shows that $m = \Theta(\epsilon^{-2} \log n + \epsilon^{-1} \log^2 n)$, demonstrating that the $\epsilon^{-2}$ and $\log^2 n$ terms do not multiply each other. We also show a nearly matching lower bound of $m = \Omega(\epsilon^{-2} \log(1/\delta) + \epsilon^{-1} \log^q(1/\delta))$.
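The composition step can be illustrated with a standard CountSketch applied to the output of the snippet above (helper name, dimensions, and seeds are ours, chosen only for illustration):

```python
def countsketch(s, m_out, seed=None):
    """Reduce an m-dimensional vector to m_out dimensions with CountSketch:
    each coordinate gets a random sign and is hashed into one of m_out
    buckets, with collisions summed."""
    rng = np.random.default_rng(seed)
    m = s.shape[0]
    buckets = rng.integers(0, m_out, size=m)  # hash bucket per coordinate
    signs = rng.choice([-1.0, 1.0], size=m)   # random sign per coordinate
    out = np.zeros(m_out)
    np.add.at(out, buckets, signs * s)        # scatter-add into buckets
    return out

# Compose: first the sign sketch to an intermediate dimension, then
# CountSketch down to a smaller final dimension (x, y as above).
t = countsketch(tensor_sketch([x, y], m=20000, seed=1), m_out=4000, seed=2)
print(np.linalg.norm(t))  # still close to ||x|| * ||y||
```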
In a number of applications, one has $\delta = 1/\mathrm{poly}(n)$, and in this case our bounds are optimal up to a constant factor. This is the first high-probability sketch for tensor products that has optimal sketch size and can be implemented in $m \cdot \sum_i \mathrm{nnz}(x_i)$ time, where $\mathrm{nnz}(x_i)$ is the number of non-zero entries of $x_i$.
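The stated running time reflects that each inner product $\langle u^{i,j}, x_j \rangle$ only needs signs at the nonzero coordinates of $x_j$. A sparse variant of the earlier snippet (input format and names are ours; a full implementation would derive each sign pseudorandomly from its coordinate index so that every input sees the same sketch $S$):

```python
def tensor_sketch_nnz(xs, m, seed=None):
    """Variant of tensor_sketch taking each x_j as (indices, values) of its
    nonzeros. Signs at zero coordinates never affect <u^{i,j}, x_j>, so only
    m * nnz(x_j) signs are used per vector: m * sum_j nnz(x_j) total work.
    (In a full implementation, idx would seed the per-coordinate sign hash.)
    """
    rng = np.random.default_rng(seed)
    s = np.full(m, 1.0 / np.sqrt(m))
    for idx, vals in xs:
        u = rng.choice([-1.0, 1.0], size=(m, len(vals)))
        s *= u @ np.asarray(vals)
    return s

# A vector with 3 nonzeros out of n = 10**6 costs only m * 3 sign draws.
x_sp = (np.array([7, 123, 500000]), np.array([1.0, -2.0, 0.5]))
print(np.linalg.norm(tensor_sketch_nnz([x_sp, x_sp], m=10000, seed=3)))
```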
Lastly, we empirically compare our sketch to other sketches for tensor products, and give a novel application to compressing neural networks.