Estimation and selection procedures in regression: an L1 approach
From MaRDI portal
Publication: 4546737
DOI: 10.2307/3316011 · zbMath: 0994.62030 · OpenAlex: W2070289385 · MaRDI QID: Q4546737
Nicolas W. Hengartner, Marten H. Wegkamp
Publication date: 8 October 2002
Published in: Canadian Journal of Statistics
Full work available at URL: https://doi.org/10.2307/3316011
bandwidth selection; shatter coefficient; covering numbers; minimum distance estimators; finite sample properties of estimators; local linear regression smoothers; Nadaraya-Watson regression estimators; VC-graph classes
Related Items
- APPLIED REGRESSION ANALYSIS BIBLIOGRAPHY UPDATE 2000–2001
- Model selection in nonparametric regression
- A note on minimum distance estimation of copula densities
- A universal procedure for aggregating estimators
- A note on penalized minimum distance estimation in nonparametric regression
- Aggregating estimates by convex optimization
Cites Work
- Estimating a regression function
- Rates of convergence of minimum distance estimators and Kolmogorov's entropy
- A universally acceptable smoothing factor for kernel density estimates
- Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression
- Nonasymptotic universal smoothing factors, kernel complexity and Yatracos classes
- Model selection in nonparametric regression
- Universal consistency of local polynomial kernel regression estimates
- Weak convergence and empirical processes. With applications to statistics
- Local linear regression smoothers and their minimax efficiencies
- Inequalities for the $r$th Absolute Moment of a Sum of Random Variables, $1 \leqq r \leqq 2$
- Convergence of stochastic processes