Optimal model selection in heteroscedastic regression using piecewise polynomial functions
From MaRDI portal
Publication:1951154
DOI: 10.1214/13-EJS803 · zbMath: 1337.62083 · arXiv: 1104.1050 · MaRDI QID: Q1951154
Publication date: 29 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1104.1050
Keywords: nonparametric regression; random design; heteroscedastic noise; slope heuristics; optimal model selection; hold-out penalty
MSC classifications: Nonparametric regression and quantile regression (62G08); General nonlinear regression (62J02); Nonparametric statistical resampling methods (62G09)
Related Items (6)
- Slope heuristics and V-Fold model selection in heteroscedastic regression using strongly localized bases
- Discussion of "On concentration for (regularized) empirical risk minimization" by Sara van de Geer and Martin Wainwright
- Estimator selection: a new method with applications to kernel density estimation
- Concentration behavior of the penalized least squares estimator
- Sharp oracle inequalities and slope heuristic for specification probabilities estimation in discrete random fields
- Non-parametric Poisson regression from independent and weakly dependent observations by model selection
Uses Software
Cites Work
- Optimal model selection in density estimation
- Optimal model selection for density estimation of stationary data under various mixing conditions
- A high-dimensional Wilks phenomenon
- Slope heuristics: overview and implementation
- Aggregation via empirical risk minimization
- Gaussian model selection with an unknown variance
- A tail inequality for suprema of unbounded empirical processes with applications to Markov chains
- Minimum contrast estimators on sieves: Exponential bounds and rates of convergence
- Risk bounds for model selection via penalization
- Rates of convergence for minimum contrast estimators
- Sharp oracle inequalities for aggregation of affine estimators
- Optimal upper and lower bounds for the true and empirical excess risks in heteroscedastic least-squares regression
- Model selection by resampling penalization
- Minimal penalties for Gaussian model selection
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Aggregation for Gaussian regression
- Statistical predictor identification
- Local Rademacher complexities
- Estimating the Error Rate of a Prediction Rule: Improvement on Cross-Validation
- Information Theory and Mixing Least-Squares Regressions
- Rademacher penalties and structural risk minimization
- Aggregation by Exponential Weighting and Sharp Oracle Inequalities
- Some Comments on C_p
- Gaussian model selection
- Model selection and error estimation
- Sparse estimation by exponential weighting