Regularization, sparse recovery, and median-of-means tournaments
Publication:2419670
DOI: 10.3150/18-BEJ1046
zbMath: 1467.62131
arXiv: 1701.04112
OpenAlex: W2963290896
MaRDI QID: Q2419670
Shahar Mendelson, Gábor Lugosi
Publication date: 14 June 2019
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/1701.04112
Related Items
- Robust statistical learning with Lipschitz and convex loss functions
- Byzantine-robust distributed sparse learning for \(M\)-estimation
- Regularization, sparse recovery, and median-of-means tournaments
- Robust machine learning by median-of-means: theory and practice
- Robust classification via MOM minimization
- Iteratively reweighted \(\ell_1\)-penalized robust regression
- Robust \(k\)-means clustering for distributions with two moments
- Convergence rates of least squares regression estimators with heavy-tailed errors
- Efficient learning with robust gradient descent
- Scale calibration for high-dimensional robust regression
- Mean estimation and regression under heavy-tailed distributions: A survey
Cites Work
- Geometric median and robust estimation in Banach spaces
- Sparse recovery under weak moment assumptions
- Robust linear least squares regression
- Empirical risk minimization for heavy-tailed losses
- On spatially adaptive estimation of nonparametric regression
- "Local" vs. "global" parameters -- breaking the Gaussian complexity barrier
- Sub-Gaussian estimators of the mean of a random vector
- Regularization and the small-ball method. I: Sparse recovery
- Slope meets Lasso: improved oracle bounds and optimality
- Learning from MOM's principles: Le Cam's approach
- On aggregation for heavy-tailed classes
- Regularization, sparse recovery, and median-of-means tournaments
- Learning without Concentration
- An Unrestricted Learning Procedure
- On Multiplier Processes Under Weak Moment Assumptions