

Poster

Practical Data-Dependent Metric Compression with Provable Guarantees

Piotr Indyk · Ilya Razenshteyn · Tal Wagner

Pacific Ballroom #24

Keywords: [ Nonlinear Dimensionality Reduction and Manifold Learning ] [ Similarity and Distance Learning ]


Abstract:

We introduce a new distance-preserving compact representation of multi-dimensional point sets. Given n points in a d-dimensional space where each coordinate is represented using B bits (i.e., dB bits per point), it produces a representation of size O(d log(dB/epsilon) + log n) bits per point from which one can approximate the distances up to a factor of 1 + epsilon. Our algorithm almost matches the recent bound of (Indyk et al., 2017) while being much simpler. We compare our algorithm to Product Quantization (PQ) (Jegou et al., 2011), a state-of-the-art heuristic metric compression method. We evaluate both algorithms on several data sets: SIFT, MNIST, New York City taxi time series, and a synthetic one-dimensional data set embedded in a high-dimensional space. Our algorithm produces representations that are comparable to or better than those produced by PQ, while having provable guarantees on its performance.
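To build intuition for the bits-versus-distortion trade-off the abstract describes, here is a minimal illustrative sketch (not the paper's algorithm): uniform scalar quantization of coordinates, together with the triangle-inequality argument that bounds how much rounding can perturb any pairwise distance. All variable names and parameters below are assumptions chosen for the example.

```python
import numpy as np

# Illustrative toy only, NOT the paper's data-dependent scheme:
# quantize every coordinate to a uniform grid and check how much
# pairwise distances move.
rng = np.random.default_rng(0)
n, d = 100, 16
X = rng.uniform(0.0, 1.0, size=(n, d))  # n points in [0,1]^d

bits = 8                         # bits kept per coordinate
delta = 1.0 / (2 ** bits)        # grid cell width
Xq = np.round(X / delta) * delta  # points decoded from bits-per-coordinate codes

def pairwise_dists(A):
    """All-pairs Euclidean distances (n x n matrix)."""
    diff = A[:, None, :] - A[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

# Rounding moves each point by at most (delta/2)*sqrt(d) in l2, so by the
# triangle inequality every pairwise distance changes by at most delta*sqrt(d).
err = np.abs(pairwise_dists(X) - pairwise_dists(Xq)).max()
assert err <= delta * np.sqrt(d)
```

Note this naive grid gives only an additive distortion guarantee for a fixed number of bits per coordinate; achieving a multiplicative 1 + epsilon guarantee with roughly O(d log(dB/epsilon) + log n) bits per point is exactly what requires the data-dependent construction the paper introduces.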
