Learning without concentration for general loss functions
From MaRDI portal
Publication:1647935
DOI: 10.1007/s00440-017-0784-y
zbMath: 1393.62038
arXiv: 1410.3192
OpenAlex: W2962896218
MaRDI QID: Q1647935
Publication date: 27 June 2018
Published in: Probability Theory and Related Fields
Full work available at URL: https://arxiv.org/abs/1410.3192
Mathematics Subject Classification:
- Inference from stochastic processes and prediction (62M20)
- Nonparametric estimation (62G05)
- Learning and adaptive systems in artificial intelligence (68T05)
- Prediction theory (aspects of stochastic processes) (60G25)
Related Items
- On Multiplier Processes Under Weak Moment Assumptions
- On aggregation for heavy-tailed classes
- Aggregated hold out for sparse linear regression with a robust loss function
- Learning without concentration for general loss functions
- Upper bounds on product and multiplier empirical processes
- Generic error bounds for the generalized Lasso with sub-exponential data
- Mean estimation in high dimension
- On the Geometry of Random Polytopes
- Regularization and the small-ball method. I: Sparse recovery
- Phase retrieval with PhaseLift algorithm
- Solving equations of random convex functions via anchored regression
- Mean estimation and regression under heavy-tailed distributions: a survey
- Tractable Bayesian variable selection: beyond normality
Cites Work
- Upper bounds on product and multiplier empirical processes
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008
- Some limit theorems for empirical processes (with discussion)
- Obtaining fast error rates in nonconvex situations
- Asymptotic theory of finite dimensional normed spaces. With an appendix by M. Gromov: Isoperimetric inequalities in Riemannian manifolds
- Rates of convergence for minimum contrast estimators
- Learning without concentration for general loss functions
- Weak convergence and empirical processes. With applications to statistics
- On aggregation for heavy-tailed classes
- Local Rademacher complexities
- Learning without Concentration
- Uniform Central Limit Theorems
- Improving the sample complexity using global data