scientific article; zbMATH DE number 7307469
From MaRDI portal
Publication:5149226
Publication date: 8 February 2021
Full work available at URL: https://arxiv.org/abs/1911.00190
Title: zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Related Items (6)
- Rates of convergence for random forests via generalized U-statistics
- An embedded model estimator for non-stationary random functions using multiple secondary variables
- Adaptive Bayesian Sum of Trees Model for Covariate-Dependent Spectral Analysis
- Efficient permutation testing of variable importance measures by the example of random forests
- Large Scale Prediction with Decision Trees
- Interval Censored Recursive Forests
Uses Software
Cites Work
- Unnamed Item (12 entries; titles unavailable)
- Bagging predictors
- Random survival forests
- On the layered nearest neighbour estimate, the bagged nearest neighbour estimate and the random forest method in regression and classification
- Estimating the algorithmic variance of randomized ensembles via the bootstrap
- Standard errors for bagged and random forest estimators
- Relaxed Lasso
- Unbiased split selection for classification trees based on the Gini index
- Hedonic housing prices and the demand for clean air
- Bayesian backfitting (with comments and a rejoinder)
- Randomizing outputs to increase prediction accuracy
- Neural random forests
- Tree based weighted learning for estimating individualized treatment rules with censored data
- Consistency of random forests
- A random forest guided tour
- Atomic Decomposition by Basis Pursuit
- Degrees of freedom and model search
- Random Forests and Kernel Methods
- Survival ensembles
- How Biased is the Apparent Error Rate of a Prediction Rule?
- Estimation and Inference of Heterogeneous Treatment Effects using Random Forests
- Tuning parameters in random forests
- Impact of subsampling and tree depth on random forests
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- Censoring Unbiased Regression Trees and Ensembles
- Random Forests and Adaptive Nearest Neighbors
- Random forests