Poster
Sparse Support Recovery with Non-smooth Loss Functions
Kévin Degraux · Gabriel Peyré · Jalal Fadili · Laurent Jacques
In this paper, we study the support recovery guarantees of underdetermined sparse regression using the $\ell_1$-norm as a regularizer and a non-smooth loss function for data fidelity. More precisely, we focus in detail on the cases of $\ell_1$ and $\ell_\infty$ losses, and contrast them with the usual $\ell_2$ loss. While these losses are routinely used to account for either sparse ($\ell_1$ loss) or uniform ($\ell_\infty$ loss) noise models, a theoretical analysis of their performance is still lacking. In this article, we extend the existing theory from the smooth $\ell_2$ case to these non-smooth cases. We derive a sharp condition which ensures that the support of the vector to recover is stable to small additive noise in the observations, as long as the loss constraint size is tuned proportionally to the noise level. A distinctive feature of our theory is that it also explains what happens when the support is unstable. While the support is not stable anymore, we identify an "extended support" and show that this extended support is stable to small additive noise. To exemplify the usefulness of our theory, we give a detailed numerical analysis of the support stability/instability of compressed sensing recovery with these different losses. This highlights different parameter regimes, ranging from total support stability to progressively increasing support instability.
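To make the problem setting concrete, the estimator studied for the $\ell_\infty$ loss can be written as $\min_x \|x\|_1 \;\text{s.t.}\; \|Ax - y\|_\infty \le \tau$, which is a linear program. The following is a minimal sketch (not the authors' code) of solving it with `scipy.optimize.linprog`; the names `A`, `y`, and `tau` are illustrative, and the standard splitting $t \ge |x|$ turns the $\ell_1$ objective into a linear one.

```python
# Minimal sketch: l1-regularized regression with an l_infinity data-fidelity
# constraint,  min ||x||_1  s.t.  ||A x - y||_inf <= tau,  recast as an LP.
# This is an illustration of the problem class, not the paper's implementation.
import numpy as np
from scipy.optimize import linprog

def l1_min_linf_constraint(A, y, tau):
    """Solve min ||x||_1 s.t. ||A x - y||_inf <= tau as a linear program.

    Variables are z = [x, t] with auxiliary t >= |x|, so the objective
    becomes sum(t) and all constraints are linear inequalities.
    """
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])  # minimize sum(t)
    I = np.eye(n)
    Z = np.zeros((m, n))
    A_ub = np.vstack([
        np.hstack([ I, -I]),   #  x - t <= 0
        np.hstack([-I, -I]),   # -x - t <= 0
        np.hstack([ A,  Z]),   #  A x - y <=  tau
        np.hstack([-A,  Z]),   # -(A x - y) <= tau
    ])
    b_ub = np.concatenate([np.zeros(2 * n), y + tau, tau - y])
    bounds = [(None, None)] * n + [(0, None)] * n  # x free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[:n]
```

On a small synthetic instance with noiseless observations, the true sparse vector is itself feasible, so the recovered solution is feasible and has an $\ell_1$-norm no larger than that of the ground truth; whether the support itself is recovered is exactly the kind of question the paper's stability condition addresses.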
Author Information
Kévin Degraux (Université catholique de Louvain)
Gabriel Peyré (Université Paris Dauphine)
Jalal Fadili (CNRS-ENSICAEN-Univ. Caen)
Laurent Jacques (Université catholique de Louvain)
More from the Same Authors
- 2021 : Faster Unbalanced Optimal Transport: Translation invariant Sinkhorn and 1-D Frank-Wolfe
  Thibault Sejourne · Francois-Xavier Vialard · Gabriel Peyré
- 2021 : Linear-Time Gromov Wasserstein Distances using Low Rank Couplings and Costs
  Meyer Scetbon · Gabriel Peyré · Marco Cuturi
- 2021 Poster: Smooth Bilevel Programming for Sparse Regularization
  Clarice Poon · Gabriel Peyré
- 2021 Poster: The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation
  Thibault Sejourne · Francois-Xavier Vialard · Gabriel Peyré
- 2017 Workshop: Optimal Transport and Machine Learning
  Olivier Bousquet · Marco Cuturi · Gabriel Peyré · Fei Sha · Justin Solomon
- 2016 Poster: A Multi-step Inertial Forward-Backward Splitting Method for Non-convex Optimization
  Jingwei Liang · Jalal Fadili · Gabriel Peyré
- 2016 Poster: Stochastic Optimization for Large-scale Optimal Transport
  Aude Genevay · Marco Cuturi · Gabriel Peyré · Francis Bach
- 2015 Poster: Biologically Inspired Dynamic Textures for Probing Motion Perception
  Jonathan Vacher · Andrew Isaac Meso · Laurent U Perrinet · Gabriel Peyré
- 2015 Spotlight: Biologically Inspired Dynamic Textures for Probing Motion Perception
  Jonathan Vacher · Andrew Isaac Meso · Laurent U Perrinet · Gabriel Peyré
- 2014 Workshop: Optimal Transport and Machine Learning
  Marco Cuturi · Gabriel Peyré · Justin Solomon · Alexander Barvinok · Piotr Indyk · Robert McCann · Adam Oberman
- 2014 Poster: Local Linear Convergence of Forward--Backward under Partial Smoothness
  Jingwei Liang · Jalal Fadili · Gabriel Peyré
- 2012 Poster: A quasi-Newton proximal splitting method
  Stephen Becker · Jalal Fadili
- 2012 Spotlight: A quasi-Newton proximal splitting method
  Stephen Becker · Jalal Fadili