An efficient adaptive three-term extension of the Hestenes–Stiefel conjugate gradient method
From MaRDI portal
Publication:4631766
DOI: 10.1080/10556788.2017.1418870
zbMath: 1411.65081
OpenAlex: W2897427749
Wikidata: Q129089105
Scholia: Q129089105
MaRDI QID: Q4631766
No author found.
Publication date: 23 April 2019
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2017.1418870
Keywords: unconstrained optimization, global convergence, conjugate gradient method, conjugacy condition, sufficient descent condition, numerical comparison
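As background for the keywords above, the sketch below shows a generic three-term Hestenes–Stiefel-type search direction combined with an Armijo backtracking line search, in the spirit of the works cited on this page. It is an illustrative sketch under stated assumptions, not the adaptive method of this publication; the function name three_term_hs_cg and all parameter values are hypothetical choices.

```python
# Minimal sketch (assumptions: smooth objective, gradient available).
# Direction: d_k = -g_k + beta_k d_{k-1} - theta_k y_{k-1}, with theta_k chosen
# so that g_k^T d_k = -||g_k||^2 holds exactly (sufficient descent condition).
import numpy as np

def three_term_hs_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search along the descent direction d
        alpha, rho, c1 = 1.0, 0.5, 1e-4
        fx, gTd = f(x), float(g @ d)
        for _ in range(60):
            if f(x + alpha * d) <= fx + c1 * alpha * gTd:
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = float(d @ y)
        if abs(denom) > 1e-12:
            beta = float(g_new @ y) / denom   # Hestenes-Stiefel coefficient
            theta = float(g_new @ d) / denom  # weight of the third term
            d = -g_new + beta * d - theta * y # three-term direction
        else:
            d = -g_new                        # restart with steepest descent
        x, g = x_new, g_new
    return x

# Usage example on a convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
sol = three_term_hs_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                       lambda x: A @ x - b,
                       x0=np.zeros(2))  # approximates the solution of A x = b
```

With this choice of theta, the identity g_k^T d_k = -||g_k||^2 follows directly by substitution, which is one common way such three-term directions enforce the sufficient descent property independently of the line search.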
Related Items
- A convergent hybrid three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- Some three-term conjugate gradient methods with the new direction structure
Cites Work
- On three-term conjugate gradient algorithms for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- A new three-term conjugate gradient algorithm for unconstrained optimization
- Comment on "A new three-term conjugate gradient method for unconstrained problem"
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Two modified three-term conjugate gradient methods with sufficient descent property
- A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems
- An accurate active set Newton algorithm for large scale bound constrained optimization.
- An active set strategy based on the multiplier function or the gradient.
- Parallel SSLE algorithm for large scale constrained optimization
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- Conjugate gradient methods with Armijo-type line searches.
- A decomposition method for large-scale box constrained optimization
- An active set modified Polak-Ribière-Polyak method for large-scale nonlinear bound constrained optimization
- A class of modified FR conjugate gradient method and applications to non-negative matrix factorization
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A new three-term conjugate gradient method with descent direction for unconstrained optimization
- Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- Some descent three-term conjugate gradient methods and their global convergence
- A Two-Term PRP-Based Descent Method
- Convergence Conditions for Ascent Methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles.