A boosting method with asymmetric mislabeling probabilities which depend on covariates
Publication: 2512782
DOI: 10.1007/s00180-011-0250-8
zbMath: 1304.65037
OpenAlex: W2094691905
MaRDI QID: Q2512782
Publication date: 30 January 2015
Published in: Computational Statistics
Full work available at URL: https://doi.org/10.1007/s00180-011-0250-8
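The title describes boosting under class-asymmetric label noise whose flip probabilities vary with the covariates. The sketch below is only a generic illustration of that idea, not the paper's algorithm: a plain gradient-boosting loop (in the spirit of the cited "Greedy function approximation" work) whose binomial log-likelihood is written for the observed label through assumed flip probabilities g0(x) = P(observe 1 | true 0, x) and g1(x) = P(observe 0 | true 1, x). The function names, the use of scikit-learn regression trees, and the toy flip model are illustrative assumptions.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def boost_with_mislabel(X, y, g0, g1, n_rounds=100, lr=0.1, max_depth=2):
    # X: (n, d) covariates; y: observed 0/1 labels.
    # g0[i] = P(observe 1 | true 0, x_i), g1[i] = P(observe 0 | true 1, x_i),
    # assumed known here purely for illustration.
    F = np.zeros(len(y))                      # additive score, p(x) = sigmoid(F(x))
    trees = []
    for _ in range(n_rounds):
        p = sigmoid(F)
        q = (1.0 - g1) * p + g0 * (1.0 - p)   # P(observed label = 1 | x) under the flip model
        q = np.clip(q, 1e-6, 1.0 - 1e-6)
        # negative gradient of -[y*log q + (1-y)*log(1-q)] with respect to F
        resid = (y / q - (1 - y) / (1 - q)) * (1.0 - g0 - g1) * p * (1.0 - p)
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, resid)
        F += lr * tree.predict(X)
        trees.append(tree)
    return trees

# Toy usage: positives are mislabeled more often when x[:, 0] is large.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
true_y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
g1 = 0.3 * sigmoid(X[:, 0])                   # covariate-dependent 1 -> 0 flips
g0 = np.full(len(true_y), 0.05)               # small constant 0 -> 1 flips
flip = rng.uniform(size=len(true_y)) < np.where(true_y == 1, g1, g0)
y_obs = np.where(flip, 1 - true_y, true_y)
trees = boost_with_mislabel(X, y_obs, g0, g1)

In this sketch g0 and g1 are taken as known; in practice they would have to be modeled or estimated from the data.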
Related Items
- Canonical forest
- Robust high-dimensional regression for data with anomalous responses
- Robust mislabel logistic regression without modeling mislabel probabilities
Cites Work
- Greedy function approximation: A gradient boosting machine
- Bagging predictors
- A decision-theoretic generalization of on-line learning and an application to boosting
- On the Bayes-risk consistency of regularized boosting methods
- Using a VOM model for reconstructing potential coding regions in EST sequences
- Linear Discriminant Analysis with Misallocation in Training Samples
- A robust boosting method for mislabeled data
- Robustifying AdaBoost by Adding the Naive Error Rate
- Information Geometry of U-Boost and Bregman Divergence
- Robust Loss Functions for Boosting
- Maximum Likelihood Estimation of Misspecified Models
- Soft margins for AdaBoost
- Random forests