A recently introduced technique, called "safe screening," for sparse optimization problems allows us to identify irrelevant variables in the early stages of optimization. In this paper, we first propose a flexible framework for safe screening based on Fenchel–Rockafellar duality and then derive a strong safe screening rule for norm-regularized least squares using the proposed framework. We refer to the proposed screening rule for norm-regularized least squares as "dynamic Sasvi" because it can be interpreted as a generalization of Sasvi. Unlike the original Sasvi, it does not require the exact solution of a more strongly regularized problem; hence, it works safely in practice. We show that our screening rule always eliminates more features than the existing state-of-the-art methods.
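For intuition about what a safe screening rule does, the sketch below implements the well-known gap-safe sphere test for the Lasso (the rule of Ndiaye et al.), not the dynamic Sasvi rule proposed in this paper. It is a minimal NumPy illustration; the function name `gap_safe_screen` and all variable names are ours. A feature whose test score is below 1 is provably zero at the optimum and can be safely discarded.

```python
import numpy as np

def gap_safe_screen(X, y, w, lam):
    """Gap-safe sphere test for the Lasso
        min_w 0.5 * ||y - X w||^2 + lam * ||w||_1.
    Given any primal iterate w, returns a boolean mask that is True for
    features that are provably inactive at the optimum (safe to discard).
    Illustrative sketch only -- not the dynamic Sasvi rule.
    """
    residual = y - X @ w
    # Build a dual-feasible point by scaling the residual so that
    # ||X^T theta||_inf <= 1 (the Lasso dual constraint).
    scale = min(1.0, lam / max(np.abs(X.T @ residual).max(), 1e-12))
    theta = residual * scale / lam
    # Primal and dual objective values give the duality gap.
    primal = 0.5 * residual @ residual + lam * np.abs(w).sum()
    dual = lam * (theta @ y) - 0.5 * lam**2 * (theta @ theta)
    gap = max(primal - dual, 0.0)
    # The optimal dual point lies in a ball of this radius around theta.
    radius = np.sqrt(2.0 * gap) / lam
    # Screen feature j if |x_j^T theta| + radius * ||x_j|| < 1.
    scores = np.abs(X.T @ theta) + radius * np.linalg.norm(X, axis=0)
    return scores < 1.0
```

On a toy orthogonal design (`X = I`, `y = (3, 0, 0)`, `lam = 1`) with the iterate `w = (2, 0, 0)`, the duality gap is zero, so the test certifies the second and third features as inactive while keeping the first. Because the radius shrinks with the duality gap, the rule can be re-run dynamically as the solver makes progress, eliminating more features over time; the paper's contribution is a tighter region than this sphere.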
Author Information
Hiroaki Yamada (Kyoto University)
Makoto Yamada (Kyoto University / RIKEN AIP)
More from the Same Authors
- 2021 Poster: Adversarial Regression with Doubly Non-negative Weighting Matrices
  Tam Le · Truyen Nguyen · Makoto Yamada · Jose Blanchet · Viet Anh Nguyen
- 2019 Poster: Kernel Stein Tests for Multiple Model Comparison
  Jen Ning Lim · Makoto Yamada · Bernhard Schölkopf · Wittawat Jitkrittum
- 2019 Poster: Tree-Sliced Variants of Wasserstein Distances
  Tam Le · Makoto Yamada · Kenji Fukumizu · Marco Cuturi
- 2018 Poster: Persistence Fisher Kernel: A Riemannian Manifold Kernel for Persistence Diagrams
  Tam Le · Makoto Yamada