A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei
From MaRDI portal
Publication: Q453599
DOI: 10.1007/s10589-011-9413-3
zbMath: 1269.90105
OpenAlex: W2076837988
MaRDI QID: Q453599
Publication date: 27 September 2012
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-011-9413-3
Related Items (8)
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- A modified scaling parameter for the memoryless BFGS updating formula
- Two modified scaled nonlinear conjugate gradient methods
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- On the sufficient descent condition of the Hager-Zhang conjugate gradient methods
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
Uses Software
Cites Work
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Scaled conjugate gradient algorithms for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- Two-Point Step Size Gradient Methods
- Conjugate Gradient Methods with Inexact Searches
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- A spectral conjugate gradient method for unconstrained optimization