A descent nonlinear conjugate gradient method for large-scale unconstrained optimization
DOI: 10.1016/j.amc.2006.08.087 · zbMath: 1117.65097 · OpenAlex: W2088712421 · MaRDI QID: Q883860
Yanlin Zhao, Gaohang Yu, Zeng-xin Wei
Publication date: 12 June 2007
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2006.08.087
Keywords: unconstrained optimization; global convergence; numerical results; large-scale optimization; conjugate gradient method; Wolfe conditions; Polak-Ribière-Polyak method; Zoutendijk condition
Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
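The keywords above point at a Polak-Ribière-Polyak-type conjugate gradient scheme with Wolfe-condition line searches. As a rough, hedged sketch (not the paper's actual method), the classical PRP iteration with the nonnegativity safeguard (PRP+) and a simple backtracking Armijo line search in place of a full Wolfe search might look like:

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Polak-Ribiere-Polyak conjugate gradient with the PRP+ safeguard
    (beta clipped at zero) and a backtracking Armijo line search.
    This is a generic textbook sketch, not the method of the cited paper."""
    x = x0.astype(float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search enforcing the Armijo
        # sufficient-decrease condition (a simplification of
        # the Wolfe conditions named in the keywords).
        alpha, c1 = 1.0, 1e-4
        fx = f(x)
        while f(x + alpha * d) > fx + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP formula: beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2,
        # clipped at zero (PRP+).
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
        if g.dot(d) >= 0:
            d = -g  # safeguard: restart with steepest descent
    return x
```

For instance, on the strictly convex quadratic `f(x) = 0.5 x^T A x - b^T x` with `A = [[3, 1], [1, 2]]` and `b = [1, 1]`, the iteration converges to the solution of `A x = b`, namely `x = [0.2, 0.4]`.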
Cites Work
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- Efficient generalized conjugate gradient algorithms. I: Theory
- Efficient hybrid conjugate gradient techniques
- Global convergence result for conjugate gradient methods
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Convergence properties of the Fletcher-Reeves method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Global convergence of conjugate gradient methods without line search