ERM and RERM are optimal estimators for regression problems when malicious outliers corrupt the labels
From MaRDI portal
Publication:2209821
DOI: 10.1214/20-EJS1754 ⋮ zbMath: 1453.62484 ⋮ arXiv: 1910.10923 ⋮ MaRDI QID: Q2209821
Publication date: 5 November 2020
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1910.10923
Nonparametric regression and quantile regression (62G08) ⋮ Nonparametric robustness (62G35) ⋮ Minimax procedures in statistical decision theory (62C20)
Related Items (3)
All-in-one robust estimator of the Gaussian mean ⋮ ERM and RERM are optimal estimators for regression problems when malicious outliers corrupt the labels ⋮ Unnamed Item
Cites Work
- Geometric median and robust estimation in Banach spaces
- Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- A general decision theory for Huber's \(\epsilon\)-contamination model
- On the prediction performance of the Lasso
- Probability in Banach spaces. Isoperimetry and processes
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Regularization in kernel learning
- Robust M-estimators of multivariate location and scatter
- Asymptotic behavior of M-estimators for the linear model
- Smooth discrimination analysis
- Sub-Gaussian estimators of the mean of a random vector
- Regularization and the small-ball method. I: Sparse recovery
- Robust covariance and scatter matrix estimation under Huber's contamination model
- Optimal aggregation of classifiers in statistical learning.
- On the conditions used to prove oracle results for the Lasso
- Slope meets Lasso: improved oracle bounds and optimality
- Robust statistical learning with Lipschitz and convex loss functions
- Robust machine learning by median-of-means: theory and practice
- Mean estimation with sub-Gaussian rates in polynomial time
- ERM and RERM are optimal estimators for regression problems when malicious outliers corrupt the labels
- Algorithms of robust stochastic optimization based on mirror descent method
- Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
- Localized Gaussian width of \(M\)-convex hulls with applications to Lasso and convex aggregation
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Optimal rates for the regularized least-squares algorithm
- Empirical minimization
- Local Rademacher complexities
- Learning theory estimates via integral operators and their approximations
- The Future of Data Analysis
- Support Vector Machines
- The Influence Curve and Its Role in Robust Estimation
- Robust Estimators in High-Dimensions Without the Computational Intractability
- High-Dimensional Probability
- DOI: 10.1162/1532443041424337
- Efficient Algorithms and Lower Bounds for Robust Linear Regression
- High-Dimensional Robust Mean Estimation in Nearly-Linear Time
- Upper and Lower Bounds for Stochastic Processes
- A General Qualitative Definition of Robustness