Developing a new conjugate gradient algorithm with the benefit of some desirable properties of the Newton algorithm for unconstrained optimization
DOI: 10.11948/20230268
MaRDI QID: Q6615082
Hamza Guebbai, Naima Hamel, Noureddine Benrabia, Mourad Ghiat
Publication date: 8 October 2024
Published in: Journal of Applied Analysis and Computation
global convergence; conjugate gradient algorithm; Newton method; quadratic convergence behaviour; unconstrained optimization
Numerical mathematical programming methods (65K05) Large-scale problems in mathematical programming (90C06) Nonconvex programming, global optimization (90C26) Nonlinear programming (90C30) Newton-type methods (49M15)
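As background for the class of methods this record indexes, the sketch below implements the classical linear conjugate gradient method of Hestenes and Stiefel (one of the works cited further down), applied to a symmetric positive definite system. This is not the new algorithm developed in the publication itself, only a minimal illustration of the conjugate gradient iteration that nonlinear variants build on; the test matrix `A` and right-hand side `b` are arbitrary example data.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive definite A by the
    Hestenes-Stiefel conjugate gradient iteration."""
    n = b.size
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x          # residual, equal to the negative gradient
    d = r.copy()           # first search direction: steepest descent
    rs = r @ r
    for _ in range(max_iter or n):
        Ad = A @ d
        alpha = rs / (d @ Ad)        # exact line search for a quadratic
        x = x + alpha * d
        r = r - alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:    # converged: residual is small
            break
        d = r + (rs_new / rs) * d    # conjugate direction update
        rs = rs_new
    return x

# Arbitrary 2x2 SPD example: the exact solution is (1/11, 7/11).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_sol = conjugate_gradient(A, b)
print(x_sol)
```

In exact arithmetic the iteration terminates in at most `n` steps for an `n`-dimensional system; nonlinear extensions such as those cited below replace the exact quadratic line search with Wolfe-type conditions and recompute the direction coefficient from successive gradients.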
Cites Work
- Title not available
- Title not available
- Title not available
- New hybrid conjugate gradient method for unconstrained optimization
- The convergence properties of some new conjugate gradient methods
- Newton's method and its use in optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection
- Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
- A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems
- New hybrid conjugate gradient method as a convex combination of LS and FR methods
- The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- Implicit steepest descent algorithm for optimization with orthogonality constraints
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Linear Hybridization of Dai-Yuan and Hestenes-Stiefel Conjugate Gradient Method for Unconstrained Optimization
- A Noise-Tolerant Quasi-Newton Algorithm for Unconstrained Optimization
- A nonmonotone gradient method for constrained multiobjective optimization problems
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles.