Robust and tuning-free sparse linear regression via square-root slope
DOI: 10.1137/23m1608690
MaRDI QID: Q6583518
Stanislav Minsker, Lang Wang, Mohamed Ndaoud
Publication date: 6 August 2024
Published in: SIAM Journal on Mathematics of Data Science
MSC classifications: Ridge regression; shrinkage estimators (Lasso) (62J07); Robustness and adaptive procedures (parametric inference) (62F35)
Cites Work
- Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- High dimensional robust M-estimation: asymptotic variance via approximate message passing
- A general decision theory for Huber's \(\epsilon\)-contamination model
- Adaptive robust estimation in sparse vector model
- SLOPE-adaptive variable selection via convex optimization
- Improved bounds for square-root Lasso and square-root slope
- On the conditions used to prove oracle results for the Lasso
- Slope meets Lasso: improved oracle bounds and optimality
- Robust and efficient mean estimation: an approach based on the properties of self-normalized sums
- Robust machine learning by median-of-means: theory and practice
- Robust regression via multivariate regression depth
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Robust Lasso With Missing and Grossly Corrupted Observations
- Outlier Detection Using Nonconvex Penalized Regression
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Highly Robust Error Correction by Convex Programming
- Sharp Oracle Inequalities for Square Root Regularization
- Efficient Algorithms and Lower Bounds for Robust Linear Regression
- A Simple Tool for Bounding the Deviation of Random Matrices on Geometric Sets