New conjugate gradient-like methods for unconstrained optimization
From MaRDI portal
Publication:2926084
DOI: 10.1080/10556788.2014.898764
zbMath: 1308.65090
OpenAlex: W1979612361
MaRDI QID: Q2926084
Publication date: 29 October 2014
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2014.898764
Keywords: unconstrained optimization, global convergence, conjugate gradient method, line search, sufficient descent condition
Uses Software
Cites Work
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- A gradient-related algorithm with inexact line searches
- Global convergence of a memory gradient method for unconstrained optimization
- Global convergence result for conjugate gradient methods
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- Memory gradient method for the minimization of functions
- Study on a supermemory gradient method for the minimization of functions
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- Algorithm 851
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Two-Point Step Size Gradient Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Convergence properties of the Fletcher-Reeves method
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- A Two-Term PRP-Based Descent Method
- CUTEr and SifDec
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles