Multinomial Logistic Regression Algorithms via Quadratic Gradient

From MaRDI portal

Publication:6407812

arXiv: 2208.06828 · MaRDI QID: Q6407812

Author name not available

Publication date: 14 August 2022

Abstract: Multinomial logistic regression, also known as multiclass logistic regression or softmax regression, is a fundamental classification method that generalizes binary logistic regression to multiclass problems. A recent work proposed a faster gradient, called the quadratic gradient, that can accelerate binary logistic regression training, and presented an enhanced Nesterov's accelerated gradient (NAG) method for binary logistic regression. In this paper, we extend this work to multiclass logistic regression and propose an enhanced Adaptive Gradient Algorithm (Adagrad) that can accelerate the original Adagrad method. We test the enhanced NAG method and the enhanced Adagrad method on several multiclass datasets. Experimental results show that both enhanced methods converge faster than their original counterparts.




Has companion code repository: https://github.com/petitioner/ml.multiclasslrtraining








This page was built for publication: Multinomial Logistic Regression Algorithms via Quadratic Gradient
