Deformation of log-likelihood loss function for multiclass boosting
Publication:1784701
DOI: 10.1016/j.neunet.2010.05.009
zbMath: 1401.62096
OpenAlex: W2073064881
Wikidata: Q46935328
Scholia: Q46935328
MaRDI QID: Q1784701
Publication date: 27 September 2018
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2010.05.009
Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items (2)
- Fully corrective boosting with arbitrary loss and regularization
- Boosting conditional probability estimators
Uses Software
Cites Work
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses
- Multicategory classification by support vector machines
- A decision-theoretic generalization of on-line learning and an application to boosting
- Inference for the generalization error
- A note on margin-based loss functions in classification
- Classification by pairwise coupling
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- On the Bayes-risk consistency of regularized boosting methods.
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Support-vector networks
- Improved boosting algorithms using confidence-rated predictions
- Boosting for high-dimensional linear models
- Boosting with early stopping: convergence and consistency
- 10.1162/15324430152733133
- 10.1162/15324430260185628
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- Robustifying AdaBoost by Adding the Naive Error Rate
- 10.1162/1532443041424319
- 10.1162/153244304773936072
- Information Geometry of U-Boost and Bregman Divergence
- Advanced Lectures on Machine Learning
- Robust Loss Functions for Boosting
- Multicategory Support Vector Machines
- Convexity, Classification, and Risk Bounds
- Boosting in the presence of noise
- Soft margins for AdaBoost
- The elements of statistical learning. Data mining, inference, and prediction