Estimating the algorithmic variance of randomized ensembles via the bootstrap
From MaRDI portal
Publication: 666594
DOI: 10.1214/18-AOS1707
zbMath: 1415.62045
arXiv: 1907.08742
MaRDI QID: Q666594
Publication date: 6 March 2019
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1907.08742
Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Nonparametric statistical resampling methods (62G09)
- Learning and adaptive systems in artificial intelligence (68T05)
- Randomized algorithms (68W20)
- Prediction theory (aspects of stochastic processes) (60G25)
Related Items (9)
- Measuring the Algorithmic Convergence of Randomized Ensembles: The Regression Setting
- Randomized numerical linear algebra: Foundations and algorithms
- Unnamed Item
- Unnamed Item
- Estimating the algorithmic variance of randomized ensembles via the bootstrap
- Estimating a sharp convergence bound for randomized ensembles
- Unnamed Item
- Unnamed Item
- Bootstrapping the operator norm in high dimensions: error estimation for covariance matrices and sketching
Uses Software
Cites Work
- Bagging predictors
- On the asymptotics of random forests
- Estimating the algorithmic variance of randomized ensembles via the bootstrap
- Sample size selection in optimization methods for machine learning
- Standard errors for bagged and random forest estimators
- Extrapolation methods: theory and practice
- Analyzing bagging
- Weak convergence and empirical processes. With applications to statistics
- How large should ensembles of classifiers be?
- Consistency of random forests
- Comments on: "A random forest guided tour"
- Condition
- Random Forests and Kernel Methods
- Variance reduction in purely random forests
- Richardson Extrapolation and the Bootstrap
- Foundations of Modern Probability
- Practical Extrapolation Methods
- Estimation and Accuracy After Model Selection
- Properties of Bagged Nearest Neighbour Classifiers
- Random-projection Ensemble Classification
- Random Forests and Adaptive Nearest Neighbors
- The elements of statistical learning. Data mining, inference, and prediction
- Random forests