A new conjugate gradient method based on quasi-Newton equation for unconstrained optimization
DOI: 10.1016/j.cam.2018.10.035 · OpenAlex: W2899329799 · MaRDI QID: Q1713190
Jianglan Yu, Juanjuan Shi, Xiao Liang Dong, Xiang-Li Li
Publication date: 24 January 2019
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2018.10.035
Keywords: global convergence; quasi-Newton equation; spectral conjugate gradient method; unconstrained optimization problems
MSC classification: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53); Methods of reduced gradient type (90C52)
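For context on the method class named in the keywords: a nonlinear conjugate gradient method minimizes a smooth function by combining the current negative gradient with the previous search direction. The sketch below is a generic, textbook PRP+ variant with Armijo backtracking, offered only as an illustration of the family; it is NOT the authors' method, which instead builds its search direction from a quasi-Newton (modified secant) equation and a spectral parameter.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Generic nonlinear CG with the PRP+ beta and Armijo backtracking.

    A textbook sketch of the method class, not the paper's algorithm.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:            # restart if d is not a descent direction
            d = -g
        alpha = 1.0               # Armijo backtracking line search
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP+ update: beta = max(0, g_new^T (g_new - g) / ||g||^2)
        beta = max(0.0, (g_new @ (g_new - g)) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Strongly convex quadratic test problem: f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = prp_cg(f, grad, np.array([5.0, -3.0]))
```

On this quadratic the iterates converge to the unique stationary point, the solution of A x = b; quasi-Newton-based variants such as the paper's aim to improve on this baseline by injecting curvature (secant) information into the direction update.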
Related Items (8)
Cites Work
- Parallel algorithms for large-scale linearly constrained minimization problem
- An accurate active set Newton algorithm for large scale bound constrained optimization.
- Parallel SSLE algorithm for large scale constrained optimization
- A spectral PRP conjugate gradient methods for nonconvex optimization problem based on modified line search
- Exploiting damped techniques for nonlinear conjugate gradient methods
- Quasi-Newton based preconditioning and damped quasi-Newton schemes for nonlinear conjugate gradient methods
- A decomposition method for large-scale box constrained optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- A new class of spectral conjugate gradient methods based on a modified secant equation for unconstrained optimization
- A new class of smoothing functions and a smoothing Newton method for complementarity problems
- Efficient generalized conjugate gradient algorithms. II: Implementation
- Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- A Two-Term PRP-Based Descent Method
- Methods of conjugate gradients for solving linear systems
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations