Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
From MaRDI portal
Publication: 821530
DOI: 10.1007/s00009-021-01853-y
zbMath: 1477.90040
OpenAlex: W3198177180
MaRDI QID: Q821530
Publication date: 20 September 2021
Published in: Mediterranean Journal of Mathematics
Full work available at URL: https://doi.org/10.1007/s00009-021-01853-y
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Large-scale problems in mathematical programming (90C06)
- Nonlinear programming (90C30)
Cites Work
- A literature survey of benchmark functions for global optimisation problems
- A new approach based on the Newton's method to solve systems of nonlinear equations
- Efficient generalized conjugate gradient algorithms. I: Theory
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- A limited memory BFGS-type method for large-scale unconstrained optimization
- A double parameter scaled BFGS method for unconstrained optimization
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- A limited memory BFGS method for solving large-scale symmetric nonlinear equations
- A globally convergent hybrid conjugate gradient method with strong Wolfe conditions for unconstrained optimization
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- An accelerated version of Newton's method with convergence order \(\sqrt{3}+1\)
- An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- Modified three-term conjugate gradient method and its applications
- Some three-term conjugate gradient methods with the new direction structure
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- The global convergence of a modified BFGS method for nonconvex functions
- An acceleration of the continuous Newton's method
- A sufficient descent Liu–Storey conjugate gradient method and its global convergence
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles.