We prove continuity of the solution path for the group lasso, a popular method for computing group-sparse models. Unlike the more classical lasso, the group lasso solution path is non-linear and cannot be determined in closed form. To circumvent this, we first characterize the group lasso solution set and then show how to construct an implicit function for the min-norm path. We prove our implicit representation is continuous almost everywhere and extend this to continuity everywhere when the group lasso solution is unique. Our work can be viewed as extending solution path analyses from the lasso setting to the group lasso and implies that grid search is a sensible approach to hyper-parameter selection. Our results also have applications to convex reformulations of neural networks and so are deeply connected to solution paths for shallow neural networks.
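To make the grid-search claim concrete, here is a minimal sketch (not from the paper) of tracing group lasso solutions over a grid of regularization parameters via proximal gradient descent with block soft-thresholding. The function name, the warm-start scheme, and all parameters are our own illustration, not the authors' method; continuity of the path is what makes warm-started grid search behave well.

```python
import numpy as np

def group_lasso_path(X, y, groups, lambdas, iters=2000):
    """Trace solutions of min_w 0.5*||Xw - y||^2 + lam * sum_g ||w_g||_2
    over a grid of lam values, warm-starting each solve from the previous one.

    groups: list of index arrays partitioning the coordinates of w.
    Returns an array of shape (len(lambdas), d) with one solution per row.
    """
    n, d = X.shape
    lr = 1.0 / np.linalg.norm(X, 2) ** 2  # step size from the Lipschitz constant of the smooth part
    w = np.zeros(d)
    path = []
    for lam in lambdas:
        for _ in range(iters):
            # gradient step on the least-squares term
            z = w - lr * (X.T @ (X @ w - y))
            # proximal step: block soft-thresholding of each group
            for g in groups:
                nrm = np.linalg.norm(z[g])
                shrink = max(0.0, 1.0 - lr * lam / nrm) if nrm > 0 else 0.0
                z[g] = shrink * z[g]
            w = z
        path.append(w.copy())
    return np.array(path)
```

Sweeping `lambdas` from large to small mimics the hyper-parameter grid search discussed in the abstract: large values zero out every group, and the solution changes gradually as the penalty decreases.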
Author Information
Aaron Mishkin (Stanford University)
Mert Pilanci (Stanford University)