Least-squares-based three-term conjugate gradient methods
DOI: 10.1186/s13660-020-2301-6
zbMath: 1503.90135
OpenAlex: W3031039957
MaRDI QID: Q2069298
Zengru Cui, Chun-Ming Tang, Shuang-Yu Li
Publication date: 20 January 2022
Published in: Journal of Inequalities and Applications
Full work available at URL: https://doi.org/10.1186/s13660-020-2301-6
Keywords: global convergence; three-term conjugate gradient method; Wolfe-Powell line search; sufficient descent property; least-squares technique
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Methods of reduced gradient type (90C52)
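For orientation, the keywords above can be illustrated with a generic three-term conjugate gradient iteration, \(d_k = -g_k + \beta_k d_{k-1} - \theta_k y_{k-1}\) with Hestenes-Stiefel-type coefficients. This is a minimal sketch for context only, not the specific least-squares-based method of the paper, and a simple Armijo backtracking rule stands in for the Wolfe-Powell line search; all function and variable names are illustrative.

```python
import numpy as np

def three_term_cg(f, grad, x0, max_iter=500, tol=1e-8):
    # Generic three-term CG sketch: d_k = -g_k + beta_k * d_{k-1} - theta_k * y_{k-1}.
    # With these coefficients, g_k^T d_k = -||g_k||^2 holds, i.e. the
    # sufficient descent property is satisfied at every iteration.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g.copy()
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (a stand-in for Wolfe-Powell,
        # which would also enforce a curvature condition).
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                      # gradient difference y_{k-1}
        denom = d @ y
        if abs(denom) < 1e-12:
            d_new = -g_new                 # restart with steepest descent
        else:
            beta = (g_new @ y) / denom     # Hestenes-Stiefel-type beta_k
            theta = (g_new @ d) / denom    # third-term coefficient theta_k
            d_new = -g_new + beta * d - theta * y
        x, g, d = x_new, g_new, d_new
    return x

# Usage: minimize a convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = three_term_cg(f, grad, np.zeros(2))
```

The least-squares technique in the title refers to how the paper derives the coefficients \(\beta_k\) and \(\theta_k\); the classical choices used here merely show the shape of the iteration.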
Related Items (3)
Uses Software
Cites Work
- New hybrid conjugate gradient projection method for the convex constrained equations
- Two modified three-term conjugate gradient methods with sufficient descent property
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- Some three-term conjugate gradient methods with the inexact line search condition
- The convergence properties of some new conjugate gradient methods
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
- Efficient spectral computation of the stationary states of rotating Bose-Einstein condensates by preconditioned nonlinear conjugate gradient methods
- A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems
- A scaled three-term conjugate gradient method for unconstrained optimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Quasi-Newton Methods, Motivation and Theory
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A new spectral conjugate gradient method for large-scale unconstrained optimization
- Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles