An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
DOI: 10.1007/s10589-019-00143-4 · zbMath: 1433.90126 · OpenAlex: W2984664785 · Wikidata: Q126848592 · Scholia: Q126848592 · MaRDI QID: Q2301132
Publication date: 28 February 2020
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-019-00143-4
Keywords: global convergence; conjugate gradient algorithm; quasi-Newton method; preconditioned conjugate gradient algorithm; limited memory
MSC: Large-scale problems in mathematical programming (90C06); Nonconvex programming, global optimization (90C26); Numerical optimization and variational techniques (65K10); Methods of quasi-Newton type (90C53); Complexity and performance of numerical algorithms (65Y20)
Related Items (6)
Cites Work
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- New quasi-Newton methods via higher order tensor models
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
- On the limited memory BFGS method for large scale optimization
- Modified two-point stepsize gradient methods for unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- New quasi-Newton methods for unconstrained optimization problems
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- A Modified BFGS Algorithm for Unconstrained Optimization
- Algorithm 851
- Two-Point Step Size Gradient Methods
- On the Convergence of a New Conjugate Gradient Algorithm
- Numerical Optimization
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- CUTEr and SifDec
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles