On least squares estimation under heteroscedastic and heavy-tailed errors
From MaRDI portal
Publication:2119229
DOI: 10.1214/21-AOS2105 · zbMath: 1486.62113 · arXiv: 1909.02088 · OpenAlex: W2971447570 · MaRDI QID: Q2119229
Arun Kumar Kuchibhotla, Rohit Kumar Patra
Publication date: 23 March 2022
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1909.02088
Keywords: heavy tails; maximal inequality; interpolation inequality; dyadic peeling; finite sample tail probability bounds; local envelopes
MSC classification: Nonparametric regression and quantile regression (62G08); Asymptotic properties of nonparametric inference (62G20); Inequalities; stochastic orderings (60E15)
Cites Work
- Performance of empirical risk minimization in linear aggregation
- Upper bounds on product and multiplier empirical processes
- On higher order isotropy conditions and lower bounds for sparse quadratic forms
- Empirical entropy, minimax regret and minimax risk
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Linear smoothers and additive models
- On singular solutions of nonlinear elliptic and parabolic equations
- Robust linear least squares regression
- Global risk bounds and adaptation in univariate convex regression
- Risk bounds for statistical learning
- Empirical risk minimization for heavy-tailed losses
- The Grenander estimator: A nonasymptotic approach
- Estimating a regression function
- Minimum contrast estimators on sieves: Exponential bounds and rates of convergence
- Rates of convergence for minimum contrast estimators
- The use of polynomial splines and their tensor products in multivariate function estimation. (With discussion)
- Convergence rate of sieve estimates
- Consistency for the least squares estimator in nonparametric regression
- Information-theoretic determination of minimax rates of convergence
- "Local" vs. "global" parameters -- breaking the Gaussian complexity barrier
- On concentration for (regularized) empirical risk minimization
- On univariate convex regression
- Adaptive risk bounds in unimodal regression
- Nonparametric shape-restricted regression
- Sharp oracle inequalities for least squares estimators in shape restricted regression
- A distribution-free theory of nonparametric regression
- Structure adaptive approach for dimension reduction.
- Risk bounds in isotonic regression
- Weak convergence and empirical processes. With applications to statistics
- A local maximal inequality under uniform entropy
- Bracketing numbers of convex and \(m\)-monotone functions on polytopes
- On estimation of isotonic piecewise constant signals
- Robust machine learning by median-of-means: theory and practice
- Risk minimization by median-of-means tournaments
- Convergence rates of least squares regression estimators with heavy-tailed errors
- Isotonic regression in general dimensions
- Comparison and anti-concentration bounds for maxima of Gaussian random vectors
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- On weakly bounded empirical processes
- Concentration inequalities and asymptotic results for ratio type empirical processes
- On risk bounds in isotonic and other shape restricted regression problems
- Tail bounds via generic chaining
- Majorizing measures: The generic chaining
- Bracketing metric entropy rates and empirical central limit theorems for function classes of Besov- and Sobolev-type
- Learning without Concentration
- Concentration Inequalities
- Nemirovski's Inequalities Revisited
- Mathematical Foundations of Infinite-Dimensional Statistical Models
- FAST RATES FOR ESTIMATION ERROR AND ORACLE INEQUALITIES FOR MODEL SELECTION
- Central limit theorems and weak laws of large numbers in certain banach spaces
- Asymptotic Statistics
- Sieve Extremum Estimates for Weakly Dependent Data
- An Unrestricted Learning Procedure
- Upper and Lower Bounds for Stochastic Processes
- Around Nemirovski’s inequality
- Convergence of stochastic processes