Comment: Feature Screening and Variable Selection via Iterative Ridge Regression
From MaRDI portal
Publication: 6636561
DOI: 10.1080/00401706.2020.1801256
MaRDI QID: Q6636561
Publication date: 12 November 2024
Published in: Technometrics
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Support recovery without incoherence: a case for nonconvex regularization
- SLOPE-adaptive variable selection via convex optimization
- High-dimensional classification using features annealed independence rules
- Optimal filtering of square-integrable signals in Gaussian noise
- Lasso, fractional norm and structured sparse estimation using a Hadamard product parametrization
- Nonconcave penalized likelihood with a diverging number of parameters
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
- Variable selection using MM algorithms
- Ideal spatial adaptation by wavelet shrinkage
- Regularization of Wavelet Approximations
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- A Statistical View of Some Chemometrics Regression Tools
- Statistical Foundations of Data Science
- Inference and uncertainty quantification for noisy matrix completion
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Regularization and Variable Selection Via the Elastic Net
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Ridge Regression: Biased Estimation for Nonorthogonal Problems