Simple expressions of the LASSO and SLOPE estimators in low-dimension
From MaRDI portal
Publication:5222210
DOI: 10.1080/02331888.2020.1720019 · zbMath: 1435.62275 · OpenAlex: W2924077917 · MaRDI QID: Q5222210
Rémi Servien, Didier Concordet, Patrick J. C. Tardivel
Publication date: 1 April 2020
Published in: Statistics
Full work available at URL: https://doi.org/10.1080/02331888.2020.1720019
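As a rough illustration of the paper's topic (closed-form expressions of the LASSO estimator), here is a minimal sketch, assuming the special case of a design matrix with orthonormal columns, where the LASSO solution reduces to soft-thresholding of the ordinary least-squares coordinates. The function name and setup are hypothetical, not taken from the paper.

```python
import numpy as np

def lasso_orthogonal(X, y, lam):
    """Closed-form LASSO solution, assuming X has orthonormal columns
    (X^T X = I), for the objective 0.5*||y - X b||^2 + lam*||b||_1.
    Each coordinate of the OLS estimate b = X^T y is soft-thresholded at lam.
    """
    b = X.T @ y  # OLS estimate when X^T X = I
    return np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)

# Small demonstration on a random orthonormal design (hypothetical data).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(8, 3)))  # 8x3 matrix, orthonormal columns
y = rng.normal(size=8)
beta = lasso_orthogonal(Q, y, lam=0.5)
```

Outside the orthogonal case the LASSO has no such simple expression in general; the paper studies when explicit low-dimensional formulas are available for LASSO and SLOPE.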
Related Items (2)
- Proximal operator for the sorted \(\ell_1\) norm: application to testing procedures based on SLOPE
- Pattern recovery and signal denoising by SLOPE when the design matrix is orthogonal
Uses Software
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Familywise error rate control via knockoffs
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- Statistics for high-dimensional data. Methods, theory and applications.
- Controlling the false discovery rate via knockoffs
- SLOPE-adaptive variable selection via convex optimization
- On the distribution, model selection properties and uniqueness of the Lasso estimator in low and high dimensions
- A significance test for the lasso
- On Lasso refitting strategies
- High-dimensional graphs and variable selection with the Lasso
- Selective inference with unknown variance via the square-root lasso
- On Sparse Vector Recovery Performance in Structurally Orthogonal Matrices via LASSO
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using \(\ell_1\)-Constrained Quadratic Programming (Lasso)
- Sequential Selection Procedures and False Discovery Rate Control
- On the sign recovery by least absolute shrinkage and selection operator, thresholded least absolute shrinkage and selection operator, and thresholded basis pursuit denoising
This page was built for publication: Simple expressions of the LASSO and SLOPE estimators in low-dimension