Model Selection With Lasso-Zero: Adding Straw to the Haystack to Better Find Needles
From MaRDI portal
Publication: 5066436
DOI: 10.1080/10618600.2020.1869026
OpenAlex: W3125572213
MaRDI QID: Q5066436
Pascaline Descloux, Sylvain Sardy
Publication date: 29 March 2022
Published in: Journal of Computational and Graphical Statistics
Full work available at URL: https://arxiv.org/abs/1805.05133
Related Items (3)
- Aggregated hold out for sparse linear regression with a robust loss function
- A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates
- A phase transition for finding needles in nonlinear haystacks with LASSO artificial neural networks
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nonlinear total variation based noise removal algorithms
- Quantile universal threshold
- The Adaptive Lasso and Its Oracle Properties
- A mathematical introduction to compressive sensing
- Wild binary segmentation for multiple change-point detection
- Statistics for high-dimensional data. Methods, theory and applications.
- Controlling the false discovery rate via knockoffs
- SLOPE-adaptive variable selection via convex optimization
- Estimation of the mean of a multivariate normal distribution
- Estimating the dimension of a model
- False discoveries occur early on the Lasso path
- Adaptive estimation of a quadratic functional by model selection.
- Nonconcave penalized likelihood with a diverging number of parameters.
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- Rejoinder: "Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons"
- The Dantzig selector: statistical estimation when p is much larger than n (with discussions and rejoinder)
- Atomic Decomposition by Basis Pursuit
- Non-asymptotic theory of random matrices: extreme singular values
- A study of error variance estimation in Lasso regression
- On Sparse Representations in Arbitrary Redundant Bases
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Stable recovery of sparse overcomplete representations in the presence of noise
- Sparse representations in unions of bases
- Uncertainty principles and ideal atomic decomposition
- Stability Selection
- Variance Estimation Using Refitted Cross-Validation in Ultrahigh Dimensional Regression
- A generalized uncertainty principle and sparse representation in pairs of bases
- Sparse Approximate Solutions to Linear Systems
- Panning for Gold: ‘Model-X’ Knockoffs for High Dimensional Controlled Variable Selection
- Thresholded Basis Pursuit: LP Algorithm for Order-Wise Optimal Support Recovery for Sparse and Approximately Sparse Signals From Noisy Random Measurements
- Model Selection and Estimation in Regression with Grouped Variables
- Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
- For most large underdetermined systems of linear equations the minimal ℓ1-norm solution is also the sparsest solution
- Stable signal recovery from incomplete and inaccurate measurements
- Tuning Parameter Selection in High Dimensional Penalized Likelihood