Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
DOI: 10.1080/01621459.2016.1273116
zbMath: 1398.62167
arXiv: 1510.01064
OpenAlex: W2191029871
MaRDI QID: Q4962433
Alexander Hanbo Li, Jelena Bradic
Publication date: 2 November 2018
Published in: Journal of the American Statistical Association
Full work available at URL: https://arxiv.org/abs/1510.01064
Mathematics Subject Classification:
- 62G35 Nonparametric robustness
- 62H30 Classification and discrimination; cluster analysis (statistical aspects)
- 62P10 Applications of statistics to biology and medical sciences; meta analysis
- 62F35 Robustness and adaptive procedures (parametric inference)
Cites Work
- Greedy function approximation: A gradient boosting machine
- Classification with asymmetric label noise: consistency and maximal denoising
- Robustified \(L_2\) boosting
- Finite sample breakdown points of projection based multivariate location and scatter statistics
- A decision-theoretic generalization of on-line learning and an application to boosting
- Noise peeling methods to improve boosting algorithms
- Arcing classifiers. (With discussion)
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Robust fitting of the binomial model
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- Population theory for boosting ensembles
- Process consistency for AdaBoost
- Boosting a weak learning algorithm by majority
- Random classification noise defeats all convex potential boosters
- Breakdown and groups. (With discussions and rejoinder)
- Boosting with early stopping: convergence and consistency
- Learning in the Presence of Malicious Errors
- Least Median of Squares Regression
- What is invexity?
- Breakdown in Nonlinear Regression
- The Influence Curve and Its Role in Robust Estimation
- Comprehensive Definitions of Breakdown Points for Independent and Dependent Observations
- Variable Selection in Nonparametric Classification Via Measurement Error Model Selection Likelihoods
- Explaining AdaBoost
- Online Learning of Noisy Data
- Robust Loss Functions for Boosting
- Variable Selection for Support Vector Machines in Moderately High Dimensions
- Convexity, Classification, and Risk Bounds
- Prediction Games and Arcing Algorithms
- Boosting in the presence of noise
- Soft margins for AdaBoost
- An adaptive version of the boost by majority algorithm
- Logistic regression, AdaBoost and Bregman distances