The Adaptive Gril Estimator with a Diverging Number of Parameters
Publication: Q2859305
DOI: 10.1080/03610926.2011.615438
zbMath: 1462.62414
arXiv: 1302.6390
OpenAlex: W2962796305
MaRDI QID: Q2859305
Abdallah Mkhadri, Mohammed El Anbari
Publication date: 7 November 2013
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://arxiv.org/abs/1302.6390
Related Items (2)
- An extended variable inclusion and shrinkage algorithm for correlated variables
- The adaptive BerHu penalty in robust regression
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Shrinkage and model selection with correlated variables via weighted fusion
- Feature selection guided by structural information
- Asymptotic behavior of M-estimators of \(p\) regression parameters when \(p^2/n\) is large. I. Consistency
- Asymptotics for Lasso-type estimators
- Nonconcave penalized likelihood with a diverging number of parameters
- Least angle regression (with discussion)
- Sparse regression with exact clustering
- Penalized regression combining the \(L_1\) norm and a correlation based penalty
- Simultaneous analysis of Lasso and Dantzig selector
- On the adaptive elastic net with a diverging number of parameters
- Sparsity oracle inequalities for the Lasso
- Atomic Decomposition by Basis Pursuit
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sparsity and Smoothness Via the Fused Lasso
- Regularization and Variable Selection Via the Elastic Net
- On the Non-Negative Garrotte Estimator
- Tuning parameter selectors for the smoothly clipped absolute deviation method