Poster
High-dimensional union support recovery in multivariate regression
Guillaume R Obozinski · Martin J Wainwright · Michael Jordan
We study the behavior of block L1/L2 regularization for multivariate regression, where a K-dimensional response vector is regressed upon a fixed set of p covariates. We are interested in sparse multivariate regression; specifically, our goal is to recover the subset of covariates that are active in at least one of the regression problems. We refer to this as the union support recovery problem. Studying this problem under high-dimensional scaling (where the number p of problem parameters as well as the sample size n tend to infinity simultaneously), our main result shows that exact recovery is possible once the sample complexity parameter theta(n, p, s, B*) := n/(psi(B*) log(p-s)) exceeds a critical threshold. Here n is the sample size, p is the ambient dimension of the regression model, s is the size of the union of supports, and psi(B*) is a sparsity-overlap function that measures a combination of the sparsities and overlaps of the K regression coefficient vectors that constitute the model. This sparsity-overlap function reveals that, under some structural assumptions, block L1/L2 regularization for multivariate regression never harms performance relative to a naive L1 approach, and can yield substantial improvements in sample complexity (up to a factor of K) when the regression vectors are suitably orthogonal. Although our statements are made in asymptotic form for clarity, the analysis itself is non-asymptotic in nature, yielding explicit bounds for finite choices of the triple (n, p, s). We complement our theoretical results with simulations that demonstrate the sharpness of the result, even for relatively small problems (e.g., p = 32).
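The estimator discussed in the abstract can be illustrated with a small numerical sketch. The following is a minimal, assumption-laden implementation of block L1/L2 (multi-task group-lasso) regression via proximal gradient descent; the function name, step-size choice, regularization level, and toy problem sizes are all illustrative choices of ours, not taken from the paper. Rows of the coefficient matrix whose L2 norm is shrunk to zero are excluded from the estimated union support.

```python
import numpy as np

def block_l1l2_regression(X, Y, lam, n_iter=500):
    """Minimize (1/2n)||Y - X B||_F^2 + lam * sum_j ||B[j, :]||_2
    by proximal gradient descent (ISTA).  Rows of B driven to zero
    are excluded from the estimated union support."""
    n, p = X.shape
    K = Y.shape[1]
    B = np.zeros((p, K))
    # Step size from the Lipschitz constant of the smooth part.
    step = n / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ B - Y) / n
        Z = B - step * grad
        # Block soft-thresholding: the prox of the L1/L2 penalty.
        norms = np.linalg.norm(Z, axis=1, keepdims=True)
        scale = np.maximum(0.0, 1.0 - step * lam / np.maximum(norms, 1e-12))
        B = scale * Z
    return B

# Toy example (our own construction): p = 8 covariates, K = 2 tasks,
# union support {0, 1}, with each covariate active in only one task.
rng = np.random.default_rng(0)
n, p, K = 200, 8, 2
B_true = np.zeros((p, K))
B_true[0] = [1.0, 0.0]   # active only in task 1
B_true[1] = [0.0, 1.0]   # active only in task 2
X = rng.standard_normal((n, p))
Y = X @ B_true + 0.1 * rng.standard_normal((n, K))
B_hat = block_l1l2_regression(X, Y, lam=0.1)
support = np.nonzero(np.linalg.norm(B_hat, axis=1) > 1e-6)[0]
print(support)
```

Because the two active covariates have disjoint per-task supports, this toy setup is in the regime where the abstract suggests the block penalty is most advantageous relative to running K separate L1 regressions.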