Logistic regression, AdaBoost and Bregman distances
Publication: 5959943
DOI: 10.1023/A:1013912006537
zbMath: 0998.68123
OpenAlex: W1546961578
MaRDI QID: Q5959943
Authors: Michael Collins, Robert E. Schapire, Yoram Singer
Publication date: 11 April 2002
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1023/a:1013912006537
Related Items
Duality for Bregman projections onto translated cones and affine subspaces.
A secant-based Nesterov method for convex functions
Tikhonov, Ivanov and Morozov regularization for support vector machine learning
On the Bayes-risk consistency of regularized boosting methods.
The information geometry of Bregman divergences and some applications in multi-expert reasoning
Fingerprint classification based on Adaboost learning from singularity features
Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
A Fisher consistent multiclass loss function with variable margin on positive examples
Recovering occlusion boundaries from an image
Conformal mirror descent with logarithmic divergences
Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
Re-examination of Bregman functions and new properties of their divergences
A co-classification approach to learning from multilingual corpora
On the equivalence of weak learnability and linear separability: new relaxations and efficient boosting algorithms
A noise-detection based AdaBoost algorithm for mislabeled data
Mining adversarial patterns via regularized loss minimization
Can a corporate network and news sentiment improve portfolio optimization using the Black–Litterman model?
Early stopping in \(L_{2}\)Boosting
Affine invariant divergences associated with proper composite scoring rules and their applications
Texture and shape information fusion for facial expression and facial action unit recognition
The synergy between PAV and AdaBoost
Putting objects in perspective
Automated trading with boosting and expert weighting
Analysis of boosting algorithms using the smooth margin function
Robust Algorithms via PAC-Bayes and Laplace Distributions
Theory of Classification: a Survey of Some Recent Advances
Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
Sketching information divergences
Surrogate maximization/minimization algorithms and extensions
Some Universal Insights on Divergences for Statistics, Machine Learning and Artificial Intelligence
Randomized Gradient Boosting Machine
Parallelizing AdaBoost by weights dynamics
Extended Newton Methods for Multiobjective Optimization: Majorizing Function Technique and Convergence Analysis
On the convergence of a block-coordinate incremental gradient method
A boosting inspired personalized threshold method for sepsis screening
Boosting with early stopping: convergence and consistency
A new accelerated proximal boosting machine with convergence rate \(O(1/t^2)\)
APPROXIMATE BREGMAN NEAR NEIGHBORS IN SUBLINEAR TIME: BEYOND THE TRIANGLE INEQUALITY