Use of majority votes in statistical learning
From MaRDI portal
Publication: 6604473
DOI: 10.1002/wics.1362
zbMATH Open: 1545.62166
MaRDI QID: Q6604473
Publication date: 12 September 2024
Published in: Wiley Interdisciplinary Reviews: Computational Statistics (WIREs Computational Statistics)
Cites Work
- Greedy function approximation: A gradient boosting machine.
- Bagging predictors
- The Adaptive Lasso and Its Oracle Properties
- Boosting algorithms: regularization, prediction and model fitting
- Random lasso
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses
- Bayesian model averaging: A tutorial. (with comments and a rejoinder).
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- High-dimensional graphs and variable selection with the Lasso
- Boosting with early stopping: convergence and consistency
- Measuring the Accuracy of Diagnostic Systems
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Boosting With the L2 Loss
- Stability Selection
- Combining Pattern Classifiers
- Bayes Factors
- Majority Voting by Independent Classifiers Can Increase Error Rates
- Random forests
- Stochastic gradient boosting.