A discussion on practical considerations with sparse regression methodologies
From MaRDI portal
Publication: 2225315
DOI: 10.1214/20-STS806
MaRDI QID: Q2225315
Owais Sarwar, Nikolaos V. Sahinidis, Benjamin Sauk
Publication date: 8 February 2021
Published in: Statistical Science
Full work available at URL: https://arxiv.org/abs/2011.09362
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Best subset selection via a modern optimization lens
- Random lasso
- Support recovery without incoherence: a case for nonconvex regularization
- Relaxed Lasso
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- High-dimensional regression in practice: an empirical study of finite-sample prediction, variable selection and ranking
- Sparse learning via Boolean relaxations
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- SparseNet: Coordinate Descent With Nonconvex Penalties
- Better Subset Regression Using the Nonnegative Garrote
- Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
- Regressions by Leaps and Bounds
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Stability Selection
- A Statistical View of Some Chemometrics Regression Tools
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Regularization and Variable Selection Via the Elastic Net
- Ridge Regression: Biased Estimation for Nonorthogonal Problems