On weak base hypotheses and their implications for boosting regression and classification
DOI: 10.1214/aos/1015362184
zbMath: 1012.62066
OpenAlex: W1575017763
MaRDI QID: Q1848929
Publication date: 14 November 2002
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1015362184
MSC classification:
- Nonparametric regression and quantile regression (62G08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
- Nonparametric inference (62G99)
Related Items
- Process consistency for AdaBoost.
- Properties of Bagged Nearest Neighbour Classifiers
- Bandwidth choice for nonparametric classification
- A stochastic approximation view of boosting
- SVM-boosting based on Markov resampling: theory and algorithm
Cites Work
- Bagging predictors
- Decision theoretic generalizations of the PAC model for neural net and other learning applications
- Majority gates vs. general weighted threshold gates
- A decision-theoretic generalization of on-line learning and an application to boosting
- Arcing classifiers. (With discussion)
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Boosting a weak learning algorithm by majority
- Ideal spatial adaptation by wavelet shrinkage
- Minimax nonparametric classification. I: Rates of convergence
- Matching pursuits with time-frequency dictionaries
- Using iterated bagging to debias regressions