Random classification noise defeats all convex potential boosters
DOI: 10.1007/s10994-009-5165-z · zbMath: 1470.68139 · OpenAlex: W2490901831 · Wikidata: Q56114422 · Scholia: Q56114422 · MaRDI QID: Q1959553
Authors: Philip M. Long, Rocco A. Servedio
Publication date: 7 October 2010
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-009-5165-z
MSC classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items (14)
- Binary classification with corrupted labels
- Classification with asymmetric label noise: consistency and maximal denoising
- A non-intrusive correction algorithm for classification problems with corrupted data
- Unnamed Item
- Robust Support Vector Machines for Classification with Nonconvex and Smooth Losses
- The risk of trivial solutions in bipartite top ranking
- On the noise estimation statistics
- Surprising properties of dropout in deep networks
- Unnamed Item
- Robustness of learning algorithms using hinge loss with outlier indicators
- Robust Algorithms via PAC-Bayes and Laplace Distributions
- Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
- Soft-max boosting
- A Framework of Learning Through Empirical Gain Maximization
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- A geometric approach to leveraging weak learners
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Population theory for boosting ensembles.
- On the Bayes-risk consistency of regularized boosting methods.
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Boosting a weak learning algorithm by majority
- Improved boosting algorithms using confidence-rated predictions
- Boosting with early stopping: convergence and consistency
- doi:10.1162/153244304773936072
- doi:10.1162/153244304773936108
- Learning Theory
- Boosting in the presence of noise
- Soft margins for AdaBoost
- An adaptive version of the boost by majority algorithm