Shrinkage tuning parameter selection in precision matrices estimation
From MaRDI portal
Publication:538141
DOI: 10.1016/j.jspi.2011.03.008
zbMath: 1213.62099
arXiv: 0909.1123
OpenAlex: W1556261580
MaRDI QID: Q538141
Publication date: 23 May 2011
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://arxiv.org/abs/0909.1123
Related Items (8)
- Ridge estimation of inverse covariance matrices from high-dimensional data
- Group-wise shrinkage estimation in penalized model-based clustering
- Partial correlation graphical LASSO
- MARS as an alternative approach of Gaussian graphical model for biochemical networks
- A computationally fast alternative to cross-validation in penalized Gaussian graphical models
- The spectral condition number plot for regularization parameter evaluation
- Selecting the tuning parameter in penalized Gaussian graphical models
- Estimation of a sparse and spiked covariance matrix
Uses Software
Cites Work
- Sparse inverse covariance estimation with the graphical lasso
- The Adaptive Lasso and Its Oracle Properties
- Covariance regularization by thresholding
- Sparsistency and rates of convergence in large covariance matrix estimation
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- Nonconcave penalized likelihood with a diverging number of parameters.
- Network exploration via the adaptive LASSO and SCAD penalties
- High-dimensional graphs and variable selection with the Lasso
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
- Gradient directed regularization for sparse Gaussian concentration graphs, with applications to inference of genetic networks
- Model selection and estimation in the Gaussian graphical model
- Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation
- Unified LASSO Estimation by Least Squares Approximation
- Asymptotic Statistics
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Tuning parameter selectors for the smoothly clipped absolute deviation method