A robust and efficient variable selection method for linear regression
From MaRDI portal
Publication: 5044675
DOI: 10.1080/02664763.2021.1962259; OpenAlex: W3190056856; MaRDI QID: Q5044675
Zhuoran Yang, Yunlu Jiang, You-Gan Wang, Zhixiong Dong, Liya Fu
Publication date: 2 November 2022
Published in: Journal of Applied Statistics
Full work available at URL: https://doi.org/10.1080/02664763.2021.1962259
MSC classification: Ridge regression; shrinkage estimators (Lasso) (62J07); Linear regression; mixed models (62J05); Applications of statistics (62Pxx)
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Consistent tuning parameter selection in high dimensional sparse linear regression
- High breakdown-point and high efficiency robust estimates for regression
- A coordinate gradient descent method for nonsmooth separable minimization
- Bahadur representations for robust scale estimators based on regression residuals
- Robust nonnegative garrote variable selection in linear regression
- Asymptotics for Lasso-type estimators
- On the asymptotics of constrained \(M\)-estimation
- Extended Bayesian information criteria for model selection with large model spaces
- Rank-based variable selection
- Weighted Wilcoxon-Type Smoothly Clipped Absolute Deviation Method
- On Estimating Variances of Robust Estimators when the Errors are Asymmetric
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Robust Variable Selection With Exponential Squared Loss