Noise peeling methods to improve boosting algorithms
From MaRDI portal
Publication: 1660240
DOI: 10.1016/j.csda.2015.06.010
zbMath: 1468.62135
OpenAlex: W894053766
MaRDI QID: Q1660240
J. Brian Gray, Waldyn Martinez
Publication date: 15 August 2018
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2015.06.010
Computational methods for problems pertaining to statistics (62-08)
Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Learning and adaptive systems in artificial intelligence (68T05)
Related Items (3)
- 2nd special issue on robust analysis of complex data
- Editorial: Special issue on advances in data mining and robust statistics
- Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
Uses Software
Cites Work
- Greedy function approximation: A gradient boosting machine.
- Bagging predictors
- Support vector data description
- A local boosting algorithm for solving classification problems
- Robustified \(L_2\) boosting
- A decision-theoretic generalization of on-line learning and an application to boosting
- Improved generalization through explicit optimization of margins
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Boosting a weak learning algorithm by majority
- Support-vector networks
- A theory of the learnable
- DOI: 10.1162/153244304773936072
- SVDD-Based Pattern Denoising
- Boosting in the presence of noise
- Soft margins for AdaBoost
- An adaptive version of the boost by majority algorithm
- Stochastic gradient boosting.