Poster
Improved Coresets for Euclidean $k$-Means
Vincent Cohen-Addad · Kasper Green Larsen · David Saulpic · Chris Schwiegelshohn · Omar Ali Sheikh-Omar

Thu Dec 01 02:00 PM -- 04:00 PM (PST) @ Hall J #820
Given a set of $n$ points in $d$ dimensions, the Euclidean $k$-means problem consists of finding $k$ centers such that the sum of squared distances from every point to its closest center is minimized. Arguably the most popular way of dealing with this problem in the big-data setting is to first compress the data by computing a weighted subset known as a coreset, and then run any algorithm on this subset. The coreset guarantee is that, for any candidate solution, the ratio between the coreset cost and the cost of the original instance is within a $(1\pm \varepsilon)$ factor. The current state-of-the-art coreset size for Euclidean $k$-means is $\tilde O(\min(k^{2} \cdot \varepsilon^{-2},k\cdot \varepsilon^{-4}))$, which matches the lower bound of $\Omega(k \varepsilon^{-2})$ up to a $\min(k,\varepsilon^{-2})$ factor. In this paper, we improve this bound to $\tilde O(\min(k^{1.5} \cdot \varepsilon^{-2},k\cdot \varepsilon^{-4}))$. In the regime where $k \leq \varepsilon^{-2}$, this is a strict improvement over the state of the art. In particular, ours is the first provable bound that breaks through the $k^2$ barrier while retaining an optimal dependency on $\varepsilon$.
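To make the coreset guarantee concrete, the following is a minimal NumPy sketch (not from the paper; the function names `kmeans_cost` and `is_eps_coreset` are hypothetical) of the two quantities the abstract compares: the $k$-means cost of the full instance and the weighted cost of a coreset, checked against the $(1\pm \varepsilon)$ criterion for one candidate set of centers.

```python
import numpy as np

def kmeans_cost(points, centers, weights=None):
    """(Weighted) sum of squared distances from each point to its nearest center."""
    # Pairwise squared distances, shape (n_points, n_centers).
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    nearest = d2.min(axis=1)  # squared distance to the closest center
    if weights is None:
        return nearest.sum()
    return (weights * nearest).sum()

def is_eps_coreset(points, coreset, weights, centers, eps):
    """Check the (1 +/- eps) coreset guarantee for a single candidate solution.

    A true coreset must satisfy this for *every* candidate set of k centers;
    this helper only tests one.
    """
    full_cost = kmeans_cost(points, centers)
    coreset_cost = kmeans_cost(coreset, centers, weights)
    return (1 - eps) * full_cost <= coreset_cost <= (1 + eps) * full_cost
```

As a sanity check, the full point set with unit weights is trivially a coreset of itself and passes for any $\varepsilon \ge 0$; the interesting question, which the paper addresses, is how small a weighted subset can be while still passing for all candidate solutions.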