A globally convergent hybrid conjugate gradient method with strong Wolfe conditions for unconstrained optimization
From MaRDI portal
Publication:2184373
DOI: 10.1007/s40096-019-00310-y · zbMath: 1452.90321 · OpenAlex: W2984487147 · MaRDI QID: Q2184373
P. Mtagulwa, P. Kaelo, M. Thuto
Publication date: 28 May 2020
Published in: Mathematical Sciences
Full work available at URL: https://doi.org/10.1007/s40096-019-00310-y
Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53)
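The strong Wolfe conditions named in the title are the standard pair of line-search tests (sufficient decrease plus a bound on the magnitude of the directional derivative). As a minimal illustrative sketch — the toy quadratic, the direction, and the parameters c1 and c2 below are assumptions for demonstration, not the paper's specific algorithm — a check of the conditions might look like:

```python
import numpy as np

def strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe conditions for step length alpha along d.

    Sufficient decrease: f(x + a d) <= f(x) + c1 * a * g.d
    Strong curvature:    |grad(x + a d).d| <= c2 * |grad(x).d|
    with 0 < c1 < c2 < 1 (values here are common illustrative choices).
    """
    gd = grad(x) @ d
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * gd
    curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(gd)
    return armijo and curvature

# Toy convex quadratic f(x) = 0.5 x.A x - b.x (an assumption for illustration).
A = np.array([[3.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x0 = np.zeros(2)
d = -grad(x0)                                  # steepest-descent direction
# Exact minimizer along d for a quadratic: alpha* = -(g.d) / (d.A d).
alpha_star = -(grad(x0) @ d) / (d @ A @ d)
print(strong_wolfe(f, grad, x0, d, alpha_star))  # prints True
```

An exact line search on a convex quadratic drives the directional derivative to zero, so the curvature test is satisfied trivially; a badly overshot step (e.g. alpha = 100 here) fails the sufficient-decrease test.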
Related Items
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- Two efficient modifications of AZPRP conjugate gradient method with sufficient descent property
- A convergent hybrid three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- A descent three-term conjugate gradient method with global convergence properties for large scale unconstrained optimization problems
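Many of the related and cited works hybridize classical conjugate gradient update parameters such as Fletcher–Reeves (FR) and Polak–Ribière–Polyak (PRP). As a generic sketch — the hybrid rule max(0, min(β_PRP, β_FR)), the backtracking Armijo line search, the restart safeguard, and all parameter values below are illustrative assumptions, not the specific method of this paper — a hybrid CG iteration could look like:

```python
import numpy as np

def hybrid_cg(f, grad, x, max_iter=200, tol=1e-8):
    """Illustrative conjugate gradient with a hybrid update parameter
    beta = max(0, min(beta_PRP, beta_FR)) and backtracking line search."""
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search on the Armijo (sufficient decrease) condition.
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = (g_new @ g_new) / (g @ g)          # Fletcher-Reeves
        beta_prp = (g_new @ (g_new - g)) / (g @ g)   # Polak-Ribiere-Polyak
        beta = max(0.0, min(beta_prp, beta_fr))      # one common hybridization
        d = -g_new + beta * d
        if g_new @ d >= 0.0:       # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Toy convex quadratic test problem (an assumption for illustration).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = hybrid_cg(f, grad, np.zeros(2))
print(np.allclose(A @ x_star, b, atol=1e-4))
```

Clamping β at zero amounts to an automatic restart when the PRP parameter turns negative, which is a standard device in globally convergent hybrid schemes.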
Cites Work
- Unnamed Item
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- New hybrid conjugate gradient method for unconstrained optimization
- The convergence properties of some new conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- A note about WYL's conjugate gradient method and its applications
- Efficient hybrid conjugate gradient techniques
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- New hybrid conjugate gradient method as a convex combination of LS and FR methods
- An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems
- A modified spectral conjugate gradient method with global convergence
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- A nonmonotone hybrid conjugate gradient method for unconstrained optimization
- The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search
- An efficient modified Polak–Ribière–Polyak conjugate gradient method with global convergence properties
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles