

Poster

Model Selection in Gaussian Graphical Models: High-Dimensional Consistency of $\ell_1$-regularized MLE

Pradeep Ravikumar · Garvesh Raskutti · Martin J Wainwright · Bin Yu


Abstract: We study the performance of the $\ell_1$-regularized maximum likelihood estimator for the problem of estimating the graph structure associated with a Gaussian Markov random field (GMRF). We consider the performance of the estimator in the high-dimensional setting, where the number of nodes in the graph $p$, the number of edges $s$, and the maximum node degree $d$ are allowed to grow as a function of the number of samples $n$. Our main result provides sufficient conditions on the quadruple $(n,p,s,d)$ for the $\ell_1$-regularized MLE to recover all the edges of the graph with high probability. Under some conditions on the model covariance, we show that model selection can be achieved for sample sizes $n = \Omega(\max\{s,d^{2}\}\log(p))$, with the error decaying as $O(\exp(-c\log(p)))$ for some constant $c$. Empirical simulations suggest that this rate is tight for graphs where the maximum node degree $d$ scales linearly with $p$, but could be tightened for graphs with $d = O(1)$.
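The estimator analyzed in the abstract is the $\ell_1$-penalized Gaussian MLE, commonly known as the graphical lasso. The sketch below (not the authors' code; it uses scikit-learn's `GraphicalLasso` as a stand-in solver, with an illustrative chain graph and a hand-picked penalty `alpha`) shows the setup: draw $n$ samples from a GMRF with a sparse precision matrix, fit the penalized MLE, and read the recovered edges off the nonzero off-diagonal entries of the estimated precision matrix.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
p = 5  # number of nodes

# Sparse, positive-definite precision matrix for a chain graph
# (edges between consecutive nodes only), so s = p - 1 and d = 2.
Theta = np.eye(p)
for i in range(p - 1):
    Theta[i, i + 1] = Theta[i + 1, i] = 0.4
Sigma = np.linalg.inv(Theta)

# The theory suggests sample sizes on the order of max{s, d^2} log p;
# here we simply take n comfortably large for a small illustration.
n = 2000
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# l1-regularized MLE; alpha is the penalty weight (chosen by hand here,
# whereas the paper's analysis prescribes its scaling with n and p).
model = GraphicalLasso(alpha=0.05).fit(X)

# Estimated graph: support of the off-diagonal of the precision estimate.
edges = np.abs(model.precision_) > 1e-3
np.fill_diagonal(edges, False)
print(edges.astype(int))
```

With enough samples relative to $\max\{s, d^2\}\log p$, the printed support should match the chain graph's adjacency pattern; with too few samples or a poorly scaled penalty, edges are missed or spuriously added.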
