Poster
Average Sensitivity of Euclidean k-Clustering
Yuichi Yoshida · Shinji Ito
Given a set of $n$ points in $\mathbb{R}^d$, the goal of Euclidean $(k,\ell)$-clustering is to find $k$ centers that minimize the sum of the $\ell$-th powers of the Euclidean distances from each point to its closest center. In practical situations, the clustering result must be stable against missing points in the input data so that we can make trustworthy and consistent decisions. To address this issue, we consider the average sensitivity of Euclidean $(k,\ell)$-clustering, which measures the stability of the output, in total variation distance, against deleting a random point from the input data. We first show that the popular algorithm $k$-means++ and its variant $D^\ell$-sampling have low average sensitivity. Next, we show that any approximation algorithm for Euclidean $(k,\ell)$-clustering can be transformed into an algorithm with low average sensitivity while almost preserving the approximation guarantee. As byproducts of our results, we obtain several algorithms for consistent $(k,\ell)$-clustering and dynamic $(k,\ell)$-clustering in the random-order model, where the input points are randomly permuted and given in an online manner. The goal of the consistent setting is to maintain a good solution while minimizing the number of changes to the solution during the process; the goal of the dynamic setting is to maintain a good solution while minimizing the (amortized) update time.
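For readers unfamiliar with the seeding rule the abstract refers to, the following is a minimal sketch of $D^\ell$-sampling, of which $k$-means++ is the $\ell=2$ case: the first center is chosen uniformly at random, and each subsequent center is sampled with probability proportional to the $\ell$-th power of a point's distance to its nearest chosen center. Function and parameter names are illustrative, not from the paper.

```python
import random

def d_ell_sampling(points, k, ell=2, seed=0):
    """Illustrative D^ell-sampling seeding (k-means++ when ell=2):
    pick the first center uniformly at random, then repeatedly sample
    a new center with probability proportional to
    dist(x, nearest chosen center)^ell."""
    rng = random.Random(seed)
    centers = [points[rng.randrange(len(points))]]

    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    while len(centers) < k:
        # Weight each point by its distance to the closest center, raised to ell.
        weights = [min(sq_dist(p, c) for c in centers) ** (ell / 2)
                   for p in points]
        total = sum(weights)
        if total == 0:
            # All points coincide with existing centers; fall back to uniform.
            centers.append(points[rng.randrange(len(points))])
            continue
        r = rng.random() * total
        acc = 0.0
        for p, w in zip(points, weights):
            acc += w
            if acc >= r:
                centers.append(p)
                break
    return centers
```

Because far-away points receive large weights, the sampled centers tend to spread across the data, which is what makes this seeding rule a good starting point for the stability analysis described above.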
Author Information
Yuichi Yoshida (National Institute of Informatics and Preferred Infrastructure, Inc.)
Shinji Ito (NEC Corporation)
More from the Same Authors
- 2022 Poster: Single Loop Gaussian Homotopy Method for Non-convex Optimization »
  Hidenori Iwakiri · Yuhang Wang · Shinji Ito · Akiko Takeda
- 2022 Poster: Nearly Optimal Best-of-Both-Worlds Algorithms for Online Learning with Feedback Graphs »
  Shinji Ito · Taira Tsuchiya · Junya Honda
- 2021 Poster: On Optimal Robustness to Adversarial Corruption in Online Decision Problems »
  Shinji Ito
- 2021 Poster: Hybrid Regret Bounds for Combinatorial Semi-Bandits and Adversarial Linear Bandits »
  Shinji Ito
- 2020 Poster: Tight First- and Second-Order Regret Bounds for Adversarial Linear Bandits »
  Shinji Ito · Shuichi Hirahara · Tasuku Soma · Yuichi Yoshida
- 2020 Poster: A Tight Lower Bound and Efficient Reduction for Swap Regret »
  Shinji Ito
- 2020 Poster: Delay and Cooperation in Nonstochastic Linear Bandits »
  Shinji Ito · Daisuke Hatano · Hanna Sumita · Kei Takemura · Takuro Fukunaga · Naonori Kakimura · Ken-Ichi Kawarabayashi
- 2020 Spotlight: A Tight Lower Bound and Efficient Reduction for Swap Regret »
  Shinji Ito
- 2020 Spotlight: Delay and Cooperation in Nonstochastic Linear Bandits »
  Shinji Ito · Daisuke Hatano · Hanna Sumita · Kei Takemura · Takuro Fukunaga · Naonori Kakimura · Ken-Ichi Kawarabayashi
- 2020 Spotlight: Tight First- and Second-Order Regret Bounds for Adversarial Linear Bandits »
  Shinji Ito · Shuichi Hirahara · Tasuku Soma · Yuichi Yoshida
- 2019 Poster: Improved Regret Bounds for Bandit Combinatorial Optimization »
  Shinji Ito · Daisuke Hatano · Hanna Sumita · Kei Takemura · Takuro Fukunaga · Naonori Kakimura · Ken-Ichi Kawarabayashi
- 2019 Poster: Submodular Function Minimization with Noisy Evaluation Oracle »
  Shinji Ito
- 2019 Poster: Oracle-Efficient Algorithms for Online Linear Optimization with Bandit Feedback »
  Shinji Ito · Daisuke Hatano · Hanna Sumita · Kei Takemura · Takuro Fukunaga · Naonori Kakimura · Ken-Ichi Kawarabayashi
- 2017 Poster: Fitting Low-Rank Tensors in Constant Time »
  Kohei Hayashi · Yuichi Yoshida
- 2017 Spotlight: Fitting Low-Rank Tensors in Constant Time »
  Kohei Hayashi · Yuichi Yoshida
- 2016 Poster: Minimizing Quadratic Functions in Constant Time »
  Kohei Hayashi · Yuichi Yoshida
- 2015 Poster: A Generalization of Submodular Cover via the Diminishing Return Property on the Integer Lattice »
  Tasuku Soma · Yuichi Yoshida
- 2015 Poster: Monotone k-Submodular Function Maximization with Size Constraints »
  Naoto Ohsaka · Yuichi Yoshida