On surrogate loss functions and \(f\)-divergences
DOI: 10.1214/08-AOS595 · zbMath: 1162.62060 · arXiv: math/0510521 · OpenAlex: W3099188570 · MaRDI QID: Q1020983
Martin J. Wainwright, Michael I. Jordan, XuanLong Nguyen
Publication date: 4 June 2009
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0510521
Keywords: discriminant analysis; \(f\)-divergences; binary classification; quantizer design; Bayes consistency; statistical machine learning; Ali-Silvey divergences; nonparametric decentralized detection; surrogate losses
MSC classification: Classification and discrimination; cluster analysis (statistical aspects) (62H30) · Computational learning theory (68Q32) · Bayesian inference (62F15) · Statistical aspects of information-theoretic topics (62B10) · Nonparametric inference (62G99)
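For orientation only, a minimal sketch of the two objects named in the keywords, using standard definitions and illustrative notation rather than text from the record: the \(f\)-divergence (Ali-Silvey divergence) between distributions \(P\) and \(Q\), for convex \(f\) with \(f(1)=0\), and the surrogate \(\phi\)-risk of a discriminant function \(g\), whose minimization stands in for the 0-1 classification risk:
\[
D_f(P, Q) = \int f\!\left(\frac{dP}{dQ}\right) dQ,
\qquad
R_\phi(g) = \mathbb{E}\bigl[\phi\bigl(Y\, g(X)\bigr)\bigr], \quad Y \in \{-1, +1\}.
\]
The title and keywords point to the correspondence between these two quantities, which underlies the record's themes of Bayes consistency and quantizer design in decentralized detection.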
Cites Work
- Convex functions, monotone operators and differentiability.
- Process consistency for AdaBoost.
- On the Bayes-risk consistency of regularized boosting methods.
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Support-vector networks
- On the Design and Comparison of Certain Dichotomous Experiments
- Consistency of Support Vector Machines and Other Regularized Kernel Classifiers
- Applications of Ali-Silvey Distance Measures in the Design of Generalized Quantizers for Binary Decision Systems
- Some inequalities for information divergence and related measures of discrimination
- DOI: 10.1162/153244304773936108
- Nonparametric decentralized detection using kernel methods
- Convex Analysis
- Convexity, Classification, and Risk Bounds
- Equivalent Comparisons of Experiments