The LASSO risk: asymptotic results and real world examples
Mohsen Bayati · José Bento · Andrea Montanari

Wed Dec 08 12:00 AM -- 12:00 AM (PST)

We consider the problem of learning a coefficient vector x0 from noisy linear observations y = A x0 + w. In many contexts (ranging from model selection to image processing) it is desirable to construct a sparse estimator. A popular approach is to solve an ℓ1-penalized least squares problem, known as the LASSO or BPDN.
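As a concrete illustration of the estimator described above, the following is a minimal sketch that solves the ℓ1-penalized least squares problem via ISTA (iterative soft-thresholding), a standard proximal-gradient method. The function names, the demo problem sizes, and the choice of solver are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    # Entrywise soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_ista(A, y, lam, n_iters=500):
    # Minimize 0.5*||y - A x||^2 + lam*||x||_1 by proximal gradient (ISTA).
    n, p = A.shape
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth part
    x = np.zeros(p)
    for _ in range(n_iters):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Tiny demo with the model y = A x0 + w: sparse x0, Gaussian A, small noise.
rng = np.random.default_rng(0)
n, p = 100, 50
A = rng.standard_normal((n, p)) / np.sqrt(n)   # roughly unit-norm columns
x0 = np.zeros(p)
x0[:5] = 3.0
y = A @ x0 + 0.01 * rng.standard_normal(n)
xhat = lasso_ista(A, y, lam=0.01)
```

The soft-thresholding step is what produces exact zeros in the estimate, which is why this family of methods yields sparse solutions.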

For sequences of matrices A of increasing dimensions with i.i.d. Gaussian entries, we prove that the normalized risk of the LASSO converges to a limit, and we obtain an explicit expression for this limit. Our result is the first rigorous derivation of an explicit formula for the asymptotic risk of the LASSO on random instances. The proof technique is based on the analysis of approximate message passing (AMP), a recently developed efficient algorithm inspired by ideas from graphical models.
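To make the AMP iteration mentioned above concrete, here is a hedged sketch of one common form of AMP for the LASSO: soft-thresholding of x + A^T z, with a memory ("Onsager") correction added to the residual. The threshold schedule (a multiple alpha of the empirical residual level) and all parameter values are illustrative assumptions; the paper's precise calibration may differ.

```python
import numpy as np

def soft_threshold(x, t):
    # Entrywise soft-thresholding with threshold t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def amp_lasso(A, y, alpha=2.0, n_iters=30):
    # Sketch of AMP for the LASSO. alpha tunes the threshold level
    # (it maps to the regularization strength); this schedule is one
    # common convention, assumed here for illustration.
    n, p = A.shape
    delta = n / p
    x = np.zeros(p)
    z = y.copy()
    for _ in range(n_iters):
        tau = np.linalg.norm(z) / np.sqrt(n)          # empirical noise level
        x_new = soft_threshold(x + A.T @ z, alpha * tau)
        # Onsager correction: fraction of active coordinates divided by delta.
        onsager = np.mean(np.abs(x_new) > 0) / delta
        z = y - A @ x_new + onsager * z
        x = x_new
    return x

# Demo on a random Gaussian instance, matching the setting analyzed above.
rng = np.random.default_rng(1)
n, p = 200, 100
A = rng.standard_normal((n, p)) / np.sqrt(n)
x0 = np.zeros(p)
x0[:5] = 3.0
y = A @ x0 + 0.01 * rng.standard_normal(n)
xhat = amp_lasso(A, y)
```

The Onsager term is the key difference from plain iterative thresholding: it is what makes the effective noise at each iteration approximately Gaussian, which in turn is what allows the asymptotic risk to be tracked exactly.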

Through simulations on real data matrices (gene expression data and hospital medical records) we observe that these results can be relevant in a broad array of practical applications.

Author Information

Mohsen Bayati (Stanford University)
José Bento (Boston College)
Andrea Montanari (Stanford)
