A mini-batch stochastic conjugate gradient algorithm with variance reduction
From MaRDI portal
Publication: 6064058
DOI: 10.1007/s10898-022-01205-4 · zbMath: 1528.90191 · MaRDI QID: Q6064058
Publication date: 8 November 2023
Published in: Journal of Global Optimization
MSC classifications:
- Convex programming (90C25)
- Numerical optimization and variational techniques (65K10)
- Stochastic programming (90C15)
- Methods of reduced gradient type (90C52)
Cites Work
- Large-Scale Machine Learning with Stochastic Gradient Descent
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Some methods of speeding up the convergence of iteration methods
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- A Stochastic Approximation Method