AGD: an Auto-switchable Optimizer using Stepwise Gradient Difference for Preconditioning Matrix (Q6462011)
From MaRDI portal
preprint article from arXiv
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | AGD: an Auto-switchable Optimizer using Stepwise Gradient Difference for Preconditioning Matrix | preprint article from arXiv | |
Statements
publication date: 4 December 2023
arXiv classification: cs.LG
arXiv classification: cs.DC
arXiv classification: math.OC
author: Yun Yue
author: Zhiling Ye
author: Jiadi Jiang
author: Yongchao Liu
author: Ke Zhang