Stochastic gradient method with Barzilai-Borwein step for unconstrained nonlinear optimization
From MaRDI portal
Publication:1995392
DOI: 10.1134/S106423072101010X
zbMath: 1458.93268
OpenAlex: W3130150096
MaRDI QID: Q1995392
Publication date: 23 February 2021
Published in: Journal of Computer and Systems Sciences International
Full work available at URL: https://doi.org/10.1134/s106423072101010x
Cites Work
- New method of stochastic approximation type
- Introductory lectures on convex optimization. A basic course.
- Modified two-point stepsize gradient methods for unconstrained optimization
- Two-Point Step Size Gradient Methods
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- On the Barzilai and Borwein choice of steplength for the gradient method
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- A Stochastic Approximation Method
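The cited works center on the Barzilai-Borwein (BB) step size for gradient methods. As a point of reference, the sketch below illustrates the classical BB1 step, alpha_k = s^T s / s^T y with s = x_k - x_{k-1} and y = g_k - g_{k-1}, on a toy deterministic quadratic. This is only an illustration of the BB idea the paper builds on, not the paper's stochastic variant; the objective, starting point, and fallback step are assumptions chosen for the example.

```python
# Illustrative BB1 gradient descent on f(x) = 0.5*(x1^2 + 4*x2^2),
# whose gradient is (x1, 4*x2). NOT the paper's stochastic method.

def grad(x):
    return [x[0], 4.0 * x[1]]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def bb_gradient_descent(x, iters=60, alpha=0.1):
    g = grad(x)
    for _ in range(iters):
        x_new = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]  # s_k = x_k - x_{k-1}
        y = [a - b for a, b in zip(g_new, g)]  # y_k = g_k - g_{k-1}
        sy = dot(s, y)
        # BB1 step: alpha = s^T s / s^T y; fall back to a fixed
        # step when the curvature estimate s^T y is not positive.
        alpha = dot(s, s) / sy if sy > 0 else 0.1
        x, g = x_new, g_new
    return x

x_star = bb_gradient_descent([1.0, 1.0])
```

For a strictly convex quadratic like this, the BB iterates approach the minimizer at the origin; the step size adapts without any line search, which is what makes the BB rule attractive as a building block for stochastic gradient schemes.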