Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
From MaRDI portal
Publication:970585
DOI: 10.1007/s11075-009-9321-0
zbMath: 1192.65074
OpenAlex: W2081241643
MaRDI QID: Q970585
Publication date: 19 May 2010
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-009-9321-0
Keywords: unconstrained optimization; hybrid conjugate gradient method; Newton direction; numerical comparisons; modified secant condition
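The record above concerns a hybrid nonlinear conjugate gradient method. As background only, the sketch below implements a *classical* hybrid CG iteration (the Hu-Storey / Touati-Ahmed-Storey rule, beta = max(0, min(beta_PRP, beta_FR)), with a backtracking Armijo line search), not the accelerated modified-secant method of this publication, whose details are not given in the record:

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    """Generic hybrid nonlinear conjugate gradient method.

    Uses the classical hybrid parameter beta = max(0, min(beta_PRP, beta_FR))
    and a simple backtracking Armijo line search. Illustrative sketch only --
    NOT the accelerated algorithm of the paper recorded above.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # start with steepest descent
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) < tol:         # gradient-norm stopping test
            break
        # Backtracking Armijo line search: halve alpha until sufficient decrease
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * (g @ d):
            alpha *= rho
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_prp = g_new @ (g_new - g) / gnorm2   # Polak-Ribiere-Polyak
        beta_fr = (g_new @ g_new) / gnorm2        # Fletcher-Reeves
        beta = max(0.0, min(beta_prp, beta_fr))   # hybrid rule
        d = -g_new + beta * d
        if g_new @ d >= 0:                # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from the standard starting point
rosen = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
    200 * (x[1] - x[0] ** 2),
])
x_star = hybrid_cg(rosen, rosen_grad, [-1.2, 1.0])
```

The hybrid rule combines the strong practical behaviour of PRP with the global-convergence safeguards of FR, which is the general motivation behind the hybrid CG literature cited below.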
Related Items
- New hybrid conjugate gradient method for unconstrained optimization
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- An efficient hybrid conjugate gradient method with sufficient descent property for unconstrained optimization
- A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach
- New hybrid conjugate gradient method as a convex combination of LS and FR methods
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- Two modified scaled nonlinear conjugate gradient methods
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- A hybrid Riemannian conjugate gradient method for nonconvex optimization problems
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- An efficient hybrid conjugate gradient method for unconstrained optimization
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
- A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function
- Comments on "A hybrid conjugate gradient method based on a quadratic relaxation of the Dai-Yuan hybrid conjugate gradient parameter"
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai-Yuan hybrid conjugate gradient parameter
- Comments on "Another hybrid conjugate gradient algorithm for unconstrained optimization" by Andrei
Uses Software
Cites Work
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- On the limited memory BFGS method for large scale optimization
- Hybrid conjugate gradient algorithm for unconstrained optimization
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- Numerical study of a relaxed variational problem from optimal design
- On a problem of the theory of lubrication governed by a variational inequality
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence result for conjugate gradient methods
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Conjugate Gradient Method for Linear and Nonlinear Operator Equations
- Convergence Conditions for Ascent Methods
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- New properties of a nonlinear conjugate gradient method
- Benchmarking optimization software with performance profiles
- An efficient hybrid conjugate gradient method for unconstrained optimization