Asymptotic Analysis of Conditioned Stochastic Gradient Descent

From MaRDI portal

arXiv: 2006.02745
MaRDI QID: Q6342091

Author name not available

Publication date: 4 June 2020

Abstract: In this paper, we investigate a general class of stochastic gradient descent (SGD) algorithms, called conditioned SGD, based on a preconditioning of the gradient direction. Using a discrete-time approach with martingale tools, we establish the weak convergence of the rescaled sequence of iterates for a broad class of conditioning matrices including stochastic first-order and second-order methods. Almost sure convergence results, which may be of independent interest, are also presented. When the conditioning matrix is an estimate of the inverse Hessian, the algorithm is proved to be asymptotically optimal. For the sake of completeness, we provide a practical procedure to achieve this minimum variance.




Has companion code repository: https://github.com/RemiLELUC/ConditionedSGD
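
The algorithm described in the abstract updates the iterate as theta_{k+1} = theta_k - gamma_k * C_k * g_k, where g_k is a stochastic gradient and C_k a conditioning matrix. The sketch below illustrates this on a toy least-squares problem, with C_k taken as a regularized running estimate of the inverse Hessian, the asymptotically optimal choice mentioned in the abstract. All names and the example problem are illustrative assumptions, not code from the repository linked above.

```python
# Minimal sketch of conditioned SGD on a toy problem (hypothetical code,
# not taken from the linked repository). The iteration is
#     theta_{k+1} = theta_k - gamma_k * C_k @ g_k,
# where g_k is a stochastic gradient and C_k a conditioning matrix.
# With C_k = I this reduces to plain SGD; taking C_k as an estimate of
# the inverse Hessian gives the asymptotically optimal variant
# discussed in the abstract.
import numpy as np

rng = np.random.default_rng(0)

d = 5
theta_star = rng.normal(size=d)  # target parameter

def sample(theta):
    """Draw one observation (x, y) and return x together with the
    stochastic gradient of the least-squares loss 0.5 * (x @ theta - y)**2."""
    x = rng.normal(size=d)
    y = x @ theta_star + 0.1 * rng.normal()
    return x, (x @ theta - y) * x

theta = np.zeros(d)
H_hat = np.eye(d)  # running estimate of the Hessian E[x x^T]
for k in range(1, 5001):
    x, g = sample(theta)
    H_hat += (np.outer(x, x) - H_hat) / k        # online Hessian estimate
    C = np.linalg.inv(H_hat + 1e-3 * np.eye(d))  # regularized inverse Hessian
    theta -= (1.0 / k) * C @ g                   # conditioned SGD step, gamma_k = 1/k

print("distance to theta_star:", np.linalg.norm(theta - theta_star))
```

In this sketch the Hessian estimate is built online from the same observations that drive the gradient steps; the practical procedure the paper proposes for attaining the minimum variance may differ in how that estimate is constructed.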
