Graphical lasso


In statistics, the graphical lasso[1] is a sparse penalized maximum likelihood estimator for the concentration or precision matrix (the inverse of the covariance matrix) of a multivariate elliptical distribution. The original variant was formulated to solve Dempster's covariance selection problem[2][3] for the multivariate Gaussian distribution when the number of observations is limited. Subsequently, the optimization algorithms for this problem were improved[4] and extended[5] to other types of estimators and distributions.

Setting

Consider observations [math]\displaystyle{ X_1, X_2, \ldots, X_n }[/math] from a multivariate Gaussian distribution [math]\displaystyle{ X \sim N(0, \Sigma) }[/math]. We are interested in estimating the precision matrix [math]\displaystyle{ \Theta = \Sigma^{-1} }[/math].

The graphical lasso estimator is the [math]\displaystyle{ \hat{\Theta} }[/math] that minimizes the penalized negative log-likelihood:

[math]\displaystyle{ \hat{\Theta} = \operatorname{argmin}_{\Theta \ge 0} \left(\operatorname{tr}(S \Theta) - \log \det(\Theta) + \lambda \sum_{j \ne k} |\Theta_{jk}| \right) }[/math]

where [math]\displaystyle{ S }[/math] is the sample covariance matrix and [math]\displaystyle{ \lambda \ge 0 }[/math] is the penalty parameter, which controls the sparsity of the estimate.[4]
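As a concrete illustration only, the objective above can be evaluated directly with NumPy. The sketch below assumes a candidate positive definite matrix Theta; the function name graphical_lasso_objective and its arguments are illustrative, not part of any cited package.

    import numpy as np

    def graphical_lasso_objective(S, Theta, lam):
        # Penalized negative log-likelihood minimized by the graphical lasso:
        # tr(S Theta) - log det(Theta) + lambda * sum over j != k of |Theta_jk|
        sign, logdet = np.linalg.slogdet(Theta)          # numerically stable log-determinant
        off_diag_l1 = np.abs(Theta).sum() - np.trace(np.abs(Theta))  # off-diagonal L1 norm
        return np.trace(S @ Theta) - logdet + lam * off_diag_l1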

Application

In practice, the estimator can be computed with the R package glasso,[6] the GraphicalLasso class in the scikit-learn Python library,[7] or the skggm Python package[8] (which follows the scikit-learn API).
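For example, a minimal sketch of fitting the estimator with scikit-learn is given below. The simulated data and the choice alpha=0.1 (which plays the role of [math]\displaystyle{ \lambda }[/math] above) are illustrative assumptions.

    import numpy as np
    from sklearn.covariance import GraphicalLasso

    rng = np.random.default_rng(0)
    # Simulated data: n = 200 draws from a 5-dimensional Gaussian N(0, I).
    X = rng.multivariate_normal(mean=np.zeros(5), cov=np.eye(5), size=200)

    model = GraphicalLasso(alpha=0.1)  # alpha is the penalty parameter
    model.fit(X)

    Theta_hat = model.precision_   # sparse estimate of the precision matrix
    Sigma_hat = model.covariance_  # corresponding covariance estimate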

References

  1. Friedman, Jerome; Hastie, Trevor; Tibshirani, Robert (2008). "Sparse inverse covariance estimation with the graphical lasso". Biostatistics 9 (3): 432–441. doi:10.1093/biostatistics/kxm045. ISSN 1465-4644. PMID 18079126.
  2. Dempster, A. P. (1972). "Covariance Selection". Biometrics 28 (1): 157–175. doi:10.2307/2528966. ISSN 0006-341X.
  3. Banerjee, Onureena; d'Aspremont, Alexandre; El Ghaoui, Laurent (2005). "Sparse Covariance Selection via Robust Maximum Likelihood Estimation". arXiv:cs/0506023.
  4. Friedman, Jerome; Hastie, Trevor; Tibshirani, Robert (2008). "Sparse inverse covariance estimation with the graphical lasso". Biostatistics (Biometrika Trust) 9 (3): 432–441. doi:10.1093/biostatistics/kxm045. PMID 18079126. PMC 3019769. http://statweb.stanford.edu/~tibs/ftp/graph.pdf.
  5. Cai, T. Tony; Liu, Weidong; Zhou, Harrison H. (2016). "Estimating sparse precision matrix: Optimal rates of convergence and adaptive estimation". The Annals of Statistics 44 (2): 455–488. doi:10.1214/13-AOS1171. ISSN 0090-5364.
  6. Friedman, Jerome; Hastie, Trevor; Tibshirani, Rob (2014). glasso: Graphical lasso – estimation of Gaussian graphical models. https://cran.r-project.org/package=glasso.
  7. Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; Vanderplas, J.; Passos, A.; Cournapeau, D.; Brucher, M.; Perrot, M.; Duchesnay, E. (2011). "Scikit-learn: Machine Learning in Python". Journal of Machine Learning Research. Bibcode: 2012arXiv1201.0490P. http://scikit-learn.org/stable/about.html.
  8. Laska, Jason; Narayan, Manjari (2017). "skggm 0.2.7: A scikit-learn compatible package for Gaussian and related Graphical Models". Zenodo. doi:10.5281/zenodo.830033. Bibcode: 2017zndo....830033L.