Improved bounds for square-root Lasso and square-root slope
DOI: 10.1214/18-EJS1410
zbMath: 1473.62132
arXiv: 1703.02907
MaRDI QID: Q1746538
Publication date: 25 April 2018
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1703.02907
MSC classifications:
- Nonparametric regression and quantile regression (62G08)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Nonparametric estimation (62G05)
- Minimax procedures in statistical decision theory (62C20)
Related Items (3)
- Iterative algorithm for discrete structure recovery
- Estimation of the \(\ell_2\)-norm and testing in sparse linear regression with unknown variance
- Unnamed Item
Cites Work
- Unnamed Item
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- SLOPE-adaptive variable selection via convex optimization
- Bounds on the prediction error of penalized least squares estimators with convex penalty
- Regularization and the small-ball method. I: Sparse recovery
- Slope meets Lasso: improved oracle bounds and optimality
- Pivotal estimation via square-root lasso in nonparametric regression
- Simultaneous analysis of Lasso and Dantzig selector
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Scaled sparse linear regression
- Sharp Oracle Inequalities for Square Root Regularization