SVGD: A Virtual Gradients Descent Method for Stochastic Optimization


arXiv: 1907.04021
MaRDI QID: Q6321785

Author: name not available

Publication date: 9 July 2019

Abstract: Inspired by dynamic programming, we propose the Stochastic Virtual Gradient Descent (SVGD) algorithm, in which the virtual gradient is defined via the computational graph and automatic differentiation. The method is computationally efficient and has low memory requirements. We also analyze the theoretical convergence properties and the implementation of the algorithm. Experimental results on multiple datasets and network models show that SVGD has advantages over other stochastic optimization methods.
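
The entry does not reproduce the paper's update rule, so the following is only a rough, hypothetical sketch of the two ingredients the abstract names: a loss built as a computational graph, differentiated automatically, and used in a stochastic descent step. It performs plain mini-batch SGD in PyTorch and is not the paper's virtual-gradient construction; see the companion repository linked below for the actual method.

# Hypothetical sketch, NOT the paper's SVGD update. It only illustrates
# the ingredients the abstract names: a computational graph, automatic
# differentiation, and a stochastic descent step (plain SGD here).
import torch

torch.manual_seed(0)

# Toy regression data, for illustration only.
X = torch.randn(128, 10)
y = X @ torch.randn(10, 1) + 0.1 * torch.randn(128, 1)

w = torch.zeros(10, 1, requires_grad=True)  # parameter node in the graph
lr = 0.1

for step in range(100):
    idx = torch.randint(0, 128, (32,))          # stochastic mini-batch
    loss = ((X[idx] @ w - y[idx]) ** 2).mean()  # builds the computational graph
    loss.backward()                             # automatic differentiation
    with torch.no_grad():
        w -= lr * w.grad                        # plain SGD step, not the virtual gradient
        w.grad.zero_()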

Has companion code repository: https://github.com/LizhengMathAi/svgd
