A class of one parameter conjugate gradient methods
From MaRDI portal
Publication:1664259
DOI: 10.1016/j.amc.2015.05.115
zbMath: 1410.90246
OpenAlex: W2216777797
MaRDI QID: Q1664259
Shengwei Yao, Liangshuo Ning, Feifei Li, Xi-wen Lu
Publication date: 24 August 2018
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2015.05.115
Keywords: unconstrained optimization; global convergence; conjugate gradient method; Wolfe line search; continuous optimization
MSC classification: Nonlinear programming (90C30); Numerical optimization and variational techniques (65K10); Methods of reduced gradient type (90C52)
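The keywords describe the general setting of the cited works: nonlinear conjugate gradient iterations for unconstrained optimization with a line-search globalization. As a rough illustration of that scheme (not the one-parameter family proposed in this publication), the following sketch implements the classical Fletcher-Reeves variant with a simple Armijo backtracking rule standing in for a full Wolfe line search; the test function, tolerances, and step rule are illustrative assumptions.

```python
# Minimal sketch of a nonlinear conjugate gradient method (Fletcher-Reeves
# beta) with Armijo backtracking. Illustrative only: the publication studies
# a one-parameter family of such methods under Wolfe line search conditions.
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-8, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking; a (strong) Wolfe search is typical in practice
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:               # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: a convex quadratic with minimizer (1, -2)
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
x_star = fletcher_reeves_cg(f, grad, [0.0, 0.0])
```

The choice of the coefficient beta is exactly where the cited methods (Fletcher-Reeves, Wei-Yao-Liu, Dai-Yuan, Hager-Zhang, etc.) differ; a one-parameter class interpolates between such formulas while retaining global convergence.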
Uses Software
Cites Work
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search
- A class of globally convergent conjugate gradient methods
- On the convergence property of the DFP algorithm
- The convergence properties of some new conjugate gradient methods
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- A conjugate gradient method for unconstrained optimization problems
- Two new conjugate gradient methods based on modified secant equations
- A note about WYL's conjugate gradient method and its applications
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- A conjugate gradient method with global convergence for large-scale unconstrained optimization problems
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Further insight into the convergence of the Fletcher-Reeves method
- The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search
- Further studies on the Wei-Yao-Liu nonlinear conjugate gradient method
- Unified approach to quadratically convergent algorithms for function minimization
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Technical Note—A Modified Conjugate Gradient Algorithm
- Testing Unconstrained Optimization Software
- CUTE: Constrained and Unconstrained Testing Environment
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Conjugate Gradient Method for Linear and Nonlinear Operator Equations
- Convergence Conditions for Ascent Methods
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications
- Benchmarking optimization software with performance profiles.