Unbiased Boosting Estimation for Censored Survival Data
Publication: 6185138
DOI: 10.5705/ss.202021.0050
MaRDI QID: Q6185138
Publication date: 29 January 2024
Published in: Statistica Sinica
Keywords: consistency; empirical processes; survival data; machine learning; boosting; right-censoring; adjusted loss functions
Cites Work
- Greedy function approximation: A gradient boosting machine
- Boosting algorithms: regularization, prediction and model fitting
- Consistency of random survival forests
- A note on the uniform consistency of the Kaplan-Meier estimator
- Uniform consistency of the kernel conditional Kaplan-Meier estimate
- Process consistency for AdaBoost
- On the Bayes-risk consistency of regularized boosting methods
- Boosting a weak learning algorithm by majority
- Curse of dimensionality and related issues in nonparametric functional regression
- Boosted nonparametric hazards with time-dependent covariates
- Buckley-James boosting for survival analysis with high-dimensional biomarker data
- A gradient boosting algorithm for survival analysis via direct optimization of concordance index
- Boosting with early stopping: convergence and consistency
- Application of “Aggregated Classifiers” in Survival Time Studies
- Boosting method for nonlinear transformation models with censored survival data
- Survival ensembles
- Linear regression with censored data
- Boosting with the L2 loss
- Recursively Imputed Survival Trees
- Consistency of survival tree and forest models: splitting bias and correction
- A Doubly Robust Censoring Unbiased Transformation