A universal procedure for aggregating estimators
From MaRDI portal
Publication: 1002171
DOI: 10.1214/00-AOS576 · zbMath: 1155.62018 · arXiv: 0704.2500 · MaRDI QID: Q1002171
Publication date: 25 February 2009
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/0704.2500
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Asymptotic properties of nonparametric inference (62G20)
- Nonparametric estimation (62G05)
- Markov processes: estimation; hidden Markov models (62M05)
Related Items
- Hypothesis testing via affine detectors
- From local kernel to nonlocal multiple-model image denoising
- Mixing partially linear regression models
- Average estimation of semiparametric models for high-dimensional longitudinal data
- Frequentist model averaging for linear mixed-effects models
- Theory of adaptive estimation
- Mirror averaging with sparsity priors
- A new approach to estimator selection
- Model averaging by jackknife criterion in models with dependent data
- Estimator selection with respect to Hellinger-type risks
- Estimator selection in the Gaussian setting
- Jackknife model averaging
- Aggregating estimates by convex optimization
- Adaptive estimation over anisotropic functional classes via oracle approach
Cites Work
- Learning by mirror averaging
- Structural adaptation via \(\mathbb L_p\)-norm oracle inequalities
- Ordered linear smoothers
- A universally acceptable smoothing factor for kernel density estimates
- Optimal pointwise adaptive methods in nonparametric estimation
- Nonasymptotic universal smoothing factors, kernel complexity and Yatracos classes
- Model selection in nonparametric regression
- Aggregating regression procedures to improve performance
- Reconstruction of sparse vectors in white Gaussian noise
- Mixing strategies for density estimation
- Functional aggregation for nonparametric regression
- Oracle inequalities for inverse problems
- Needles and straw in haystacks: Empirical Bayes estimates of possibly sparse sequences
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001
- Aggregated estimators and empirical complexity for least square regression
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Aggregation for Gaussian regression
- Simultaneous adaptation to the margin and to complexity in classification
- Model selection via testing: an alternative to (penalized) maximum likelihood estimators
- Adapting to unknown sparsity by controlling the false discovery rate
- Adaptive Regression by Mixing
- Estimation and selection procedures in regression: an L1 approach
- Learning Theory and Kernel Machines
- Combinatorial methods in density estimation