The Loss Rank Criterion for Variable Selection in Linear Regression Analysis
From MaRDI portal
Publication:2911677
DOI: 10.1111/j.1467-9469.2011.00732.x
zbMath: 1246.62162
arXiv: 1011.1373
OpenAlex: W3123011258
MaRDI QID: Q2911677
Publication date: 1 September 2012
Published in: Scandinavian Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1011.1373
Related Items
- Shrinkage, pretest, and penalty estimators in generalized linear models
- Application of shrinkage estimation in linear regression models with autoregressive errors
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- On the distribution of penalized maximum likelihood estimators: the LASSO, SCAD, and thresholding
- Model selection with the loss rank principle
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- Estimating the dimension of a model
- Least angle regression. (With discussion)
- On the ``degrees of freedom'' of the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Testing the order of a model
- High-dimensional graphs and variable selection with the Lasso
- Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation
- Penalized Maximum Likelihood Principle for Choosing Ridge Parameter
- Invariance principles for sums of Banach space valued random elements and empirical processes
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Rademacher penalties and structural risk minimization
- Model Selection and Multimodel Inference
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Model selection and error estimation