Some improved Dai-Yuan conjugate gradient methods for large-scale unconstrained optimization problems
From MaRDI portal
Publication: 6578180
DOI: 10.1007/s12190-023-01918-z
zbMath: 1541.65037
MaRDI QID: Q6578180
Publication date: 25 July 2024
Published in: Journal of Applied Mathematics and Computing
Keywords: optimization; conjugate gradient method; large-scale problems; weak-Wolfe-Powell line search technique; Dai-Yuan method
Numerical mathematical programming methods (65K05) Large-scale problems in mathematical programming (90C06) Nonlinear programming (90C30) Methods of quasi-Newton type (90C53)
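The keywords name the classical Dai-Yuan conjugate gradient method combined with a weak Wolfe-Powell line search. As context for the record (this is not the paper's improved variants, only the standard DY baseline), a minimal sketch in Python: the DY parameter is β_k = ||g_{k+1}||² / (d_kᵀ(g_{k+1} − g_k)), and the step length is chosen by a simple bisection search for the weak Wolfe conditions. The function names `weak_wolfe` and `dy_cg` are illustrative, not from the paper.

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection/expansion search for a step satisfying the weak Wolfe conditions:
       f(x+a*d) <= f(x) + c1*a*g'd  (sufficient decrease)
       g(x+a*d)'d >= c2*g'd         (curvature)."""
    lo, hi = 0.0, np.inf
    fx, gd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gd:
            hi = alpha                      # sufficient decrease fails: shrink
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gd:
            lo = alpha                      # curvature fails: grow or bisect
            alpha = 2.0 * alpha if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def dy_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Classical Dai-Yuan nonlinear conjugate gradient iteration."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = weak_wolfe(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        # Dai-Yuan parameter: beta = ||g_{k+1}||^2 / (d_k' (g_{k+1} - g_k))
        beta = (g_new @ g_new) / (d @ (g_new - g))
        d = -g_new + beta * d
        g = g_new
    return x
```

For example, minimizing the convex quadratic f(x) = ½xᵀAx − bᵀx with A = [[3,1],[1,2]], b = [1,1] from the origin converges to the solution of Ax = b, i.e. (0.2, 0.4).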
Cites Work
- A sufficient descent Dai-Yuan type nonlinear conjugate gradient method for unconstrained optimization problems
- A literature survey of benchmark functions for global optimisation problems
- A limited memory descent Perry conjugate gradient method
- Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods
- A modified three-term PRP conjugate gradient algorithm for optimization models
- New quasi-Newton methods via higher order tensor models
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- Some three-term conjugate gradient methods with the inexact line search condition
- Two modified Dai-Yuan nonlinear conjugate gradient methods
- New quasi-Newton equation and related methods for unconstrained optimization
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- Two descent Dai-Yuan conjugate gradient methods for systems of monotone nonlinear equations
- Solving unconstrained optimization problems via hybrid CD-DY conjugate gradient methods with applications
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- Two modified DY conjugate gradient methods for unconstrained optimization problems
- Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
- A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- A modified BFGS algorithm for unconstrained optimization
- A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach
- Global convergence of a special case of the Dai-Yuan family without line search
- A nonlinear conjugate gradient method with a strong global convergence property
- Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles
- An efficient adaptive scaling parameter for the spectral conjugate gradient method