One-step sparse ridge estimation with folded concave penalty
DOI: 10.3934/mfc.2022048
MaRDI QID: Q6590301
Publication date: 21 August 2024
Published in: Mathematical Foundations of Computing
MSC classification:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Generalized linear models (logistic models) (62J12)
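The title describes a one-step sparse ridge estimator with a folded concave penalty (the SCAD and MCP penalties cited below are the standard examples). As an illustrative sketch only, and not the paper's exact estimator, the one-step idea can be mimicked by starting from a ridge estimate and applying a single coordinate-wise soft-thresholding pass whose per-coordinate thresholds come from the SCAD penalty derivative (a local linear approximation). The function names and the specific combination of steps here are assumptions for illustration.

```python
import numpy as np

def scad_deriv(t, lam, a=3.7):
    """Derivative p'_lambda(|t|) of the SCAD (folded concave) penalty.

    Equals lam for |t| <= lam, decays linearly to 0 on (lam, a*lam],
    and is 0 beyond a*lam, so large coefficients are left unpenalized.
    """
    t = np.abs(t)
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1.0))

def one_step_sparse_ridge(X, y, lam, alpha=1.0, a=3.7):
    """Illustrative one-step sketch (hypothetical simplification):
    ridge initial estimate, then one pass of coordinate-wise
    soft-thresholding with SCAD-derivative weights."""
    n, p = X.shape
    # Ridge initial estimate: (X'X + alpha I)^{-1} X'y
    beta0 = np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ y)
    w = scad_deriv(beta0, lam, a)      # adaptive per-coordinate thresholds
    r = y - X @ beta0                  # residual at the initial estimate
    beta = beta0.copy()
    for j in range(p):
        r += X[:, j] * beta[j]         # partial residual excluding coordinate j
        zj = X[:, j] @ r / n
        denom = X[:, j] @ X[:, j] / n + alpha / n
        # One-step update: soft-threshold at the SCAD-derivative weight w[j]
        beta[j] = np.sign(zj) * max(abs(zj) - w[j], 0.0) / denom
        r -= X[:, j] * beta[j]
    return beta
```

Because the SCAD derivative vanishes for coefficients larger than `a*lam`, the initial ridge estimates of strong signals pass through nearly unshrunk, while small coefficients face the full threshold `lam` and are set exactly to zero, which is the "nearly unbiased" behavior the MCP/SCAD references in the list below formalize.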
Cites Work
- Title not available
- Title not available
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Rejoinder: One-step sparse estimates in nonconcave penalized likelihood models
- Nonconcave penalized likelihood with a diverging number of parameters.
- On the adaptive elastic net with a diverging number of parameters
- Variable selection via combined penalization for high-dimensional data analysis
- Pathwise coordinate optimization
- Coordinate descent algorithms for lasso penalized regression
- High-dimensional graphs and variable selection with the Lasso
- Strong oracle optimality of folded concave penalized estimation
- Variable selection using MM algorithms
- SparseNet: Coordinate Descent With Nonconvex Penalties
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Variable selection for high-dimensional generalized linear models with the weighted elastic-net procedure
- Group variable selection via SCAD-L2
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Regularization and Variable Selection Via the Elastic Net
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- The Mnet method for variable selection