Boosting Random Forests to Reduce Bias; One-Step Boosted Forest and Its Variance Estimate
Publication: 5066399
DOI: 10.1080/10618600.2020.1820345
OpenAlex: W2789461502
MaRDI QID: Q5066399
Indrayudh Ghosal, Giles Hooker
Publication date: 29 March 2022
Published in: Journal of Computational and Graphical Statistics
Full work available at URL: https://arxiv.org/abs/1803.08000
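The paper's core idea, as reflected in its title, is to apply a single boosting step to a random forest: fit a forest, then fit a second forest to the residuals of the first and add the two predictors, trading a small variance increase for a bias reduction. A minimal pure-Python sketch of that idea follows; decision stumps stand in for full regression trees, and all function names and parameters here are illustrative assumptions, not code from the paper.

```python
import random

def fit_stump(X, y):
    # Regression stump: exhaustive search over split thresholds
    # minimizing the sum of squared errors (1-D features).
    best = None
    for t in sorted(set(X)):
        left = [yi for xi, yi in zip(X, y) if xi <= t]
        right = [yi for xi, yi in zip(X, y) if xi > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((yi - ml) ** 2 for yi in left)
               + sum((yi - mr) ** 2 for yi in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

def fit_forest(X, y, n_trees=50, rng=None):
    # Bagged ensemble of stumps: each tree is fit on a bootstrap resample.
    rng = rng or random.Random(0)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        trees.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return lambda x: sum(t(x) for t in trees) / len(trees)

def one_step_boosted_forest(X, y, rng=None):
    # One boosting step: fit a second forest to the residuals of the
    # first and return their sum as the final predictor.
    rng = rng or random.Random(0)
    f1 = fit_forest(X, y, rng=rng)
    resid = [yi - f1(xi) for xi, yi in zip(X, y)]
    f2 = fit_forest(X, resid, rng=rng)
    return lambda x: f1(x) + f2(x)
```

On a curved target such as y = x², a stump forest is heavily biased, so the residual-fitting step visibly lowers training error; the paper's additional contribution, a variance estimate for the combined predictor, is not sketched here.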
Cites Work
- Greedy function approximation: A gradient boosting machine
- Bootstrap bias corrections for ensemble methods
- Variable importance in binary regression trees and forests
- Random Forests and Kernel Methods
- Asymptotic Statistics
- Estimation and Inference of Heterogeneous Treatment Effects using Random Forests
- Estimation and Accuracy After Model Selection
- Bias-corrected random forests in regression
- A Class of Statistics with Asymptotically Normal Distribution
- The elements of statistical learning: data mining, inference, and prediction
- Random forests