Global convergence of a modified Fletcher–Reeves conjugate gradient method with Wolfe line search
DOI: 10.1142/S1793557120500813 · zbMath: 1464.65058 · OpenAlex: W2913953202 · MaRDI QID: Q5113081
Mohamed Chiheb Eddine Sellami, Badreddine Sellami
Publication date: 10 June 2020
Published in: Asian-European Journal of Mathematics
Full work available at URL: https://doi.org/10.1142/s1793557120500813
Keywords: unconstrained optimization; global convergence; conjugate gradient method; line search; sufficient descent condition
Numerical mathematical programming methods (65K05) Convex programming (90C25) Nonconvex programming, global optimization (90C26) Nonlinear programming (90C30) Combinatorial optimization (90C27)
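The record concerns a modification of the Fletcher–Reeves conjugate gradient method under a Wolfe line search. As background, a minimal sketch of the *classical* (unmodified) Fletcher–Reeves iteration with an inexact Wolfe-condition line search is shown below; the helper `wolfe_line_search` and all parameter values are illustrative assumptions, not the modified method of the cited paper.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, g, c1=1e-4, c2=0.9, alpha=1.0):
    # Simple bisection-style search for a step satisfying the
    # (weak) Wolfe conditions; an illustrative helper, not from the paper.
    lo, hi = 0.0, np.inf
    fx, gd = f(x), g @ d          # gd = directional derivative, negative for descent d
    for _ in range(50):
        if f(x + alpha * d) > fx + c1 * alpha * gd:   # Armijo (sufficient decrease) fails
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gd:       # curvature condition fails
            lo = alpha
            alpha = 2 * alpha if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=1000):
    """Classical Fletcher-Reeves conjugate gradient method."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                         # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient ||g_{k+1}||^2 / ||g_k||^2
        d = -g_new + beta * d              # new conjugate direction
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic, for example, the iterates converge to the unique minimizer; the paper's contribution is a modified update with a global convergence guarantee under the Wolfe line search for general objectives.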
Uses Software
Cites Work
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- Efficient hybrid conjugate gradient techniques
- Global convergence result for conjugate gradient methods
- Scaled conjugate gradient algorithms for unconstrained optimization
- Conjugate Gradient Methods with Inexact Searches
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Modified Hestenes-Stiefel conjugate gradient coefficient for unconstrained optimization
- Function minimization by conjugate gradients
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A spectral conjugate gradient method for unconstrained optimization
- An efficient hybrid conjugate gradient method for unconstrained optimization