A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods
Publication: Q1677476
DOI: 10.1016/j.cam.2017.10.024
zbMath: 1381.90087
OpenAlex: W2766025103
MaRDI QID: Q1677476
Publication date: 21 November 2017
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2017.10.024
Keywords: convergence analysis ⋮ unconstrained optimization ⋮ conjugate gradient algorithm ⋮ accelerated scheme ⋮ self-scaling memoryless BFGS update
MSC classification: Numerical mathematical programming methods (65K05) ⋮ Nonlinear programming (90C30) ⋮ Numerical methods based on nonlinear programming (49M37)
Related Items (2)
A class of accelerated conjugate-gradient-like methods based on a modified secant equation ⋮ Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
Cites Work
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
- Scaled conjugate gradient algorithms for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search