Optimal regression parameter-specific shrinkage by plug-in estimation
DOI: 10.1080/03610926.2019.1602649
OpenAlex: W2942295702
Wikidata: Q127977903
Scholia: Q127977903
MaRDI QID: Q5077520
Publication date: 18 May 2022
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610926.2019.1602649
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- The Adaptive Lasso and Its Oracle Properties
- Estimating the dimension of a model
- A flexible shrinkage operator for fussy grouped variable selection
- Asymptotics for Lasso-type estimators.
- Nonconcave penalized likelihood with a diverging number of parameters.
- Least angle regression. (With discussion)
- Shrinkage averaging estimation
- Convergence and sparsity of Lasso and group Lasso in high-dimensional generalized linear models
- Extended BIC for small-n-large-P sparse GLM
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
- Better Subset Regression Using the Nonnegative Garrote
- Ridge Regression and James-Stein Estimation: Review and Comments
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Shrinkage and Penalized Likelihood as Methods to Improve Predictive Accuracy
- Sparsity and Smoothness Via the Fused Lasso
- Application of shrinkage estimation in linear regression models with autoregressive errors
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- The Risk of James–Stein and Lasso Shrinkage