A descent family of the spectral Hestenes–Stiefel method by considering the quasi-Newton method
From MaRDI portal
Publication:6175562
DOI: 10.1080/10556788.2022.2142585
OpenAlex: W4313560949
MaRDI QID: Q6175562
Maryam Khoshsimaye-Bargard, Ali Ashrafi
Publication date: 24 July 2023
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2022.2142585
Keywords: global convergence, nonlinear programming, quasi-Newton method, spectral conjugate gradient method, sufficient descent property
Nonlinear programming (90C30) Methods of quasi-Newton type (90C53) Methods of reduced gradient type (90C52)
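The record describes a descent spectral Hestenes–Stiefel (HS) conjugate gradient method informed by quasi-Newton ideas. The specific family proposed in the paper is not reproduced here; as a rough illustration of the general technique, the sketch below implements a *generic* spectral HS direction, d_{k+1} = -θ_{k+1} g_{k+1} + β_k^{HS} d_k with β_k^{HS} = g_{k+1}ᵀy_k / (d_kᵀy_k), using a Barzilai–Borwein-style spectral parameter θ and a backtracking Armijo line search. The function name, line-search constants, and restart safeguard are all assumptions for this sketch, not the authors' scheme.

```python
import numpy as np

def spectral_hs_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic spectral Hestenes-Stiefel CG sketch (not the paper's family).

    Direction: d_{k+1} = -theta_{k+1} * g_{k+1} + beta_k^{HS} * d_k,
    where beta^{HS} = g_{k+1}^T y_k / (d_k^T y_k) and theta is a
    Barzilai-Borwein-style spectral parameter s^T s / s^T y.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (assumed constants c1, rho)
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Spectral parameter (Barzilai-Borwein first step size)
        theta = (s @ s) / (s @ y) if s @ y > 1e-12 else 1.0
        # Classical Hestenes-Stiefel coefficient
        beta = (g_new @ y) / (d @ y) if abs(d @ y) > 1e-12 else 0.0
        d = -theta * g_new + beta * d
        # Safeguard: restart with steepest descent if not a descent direction
        if g_new @ d >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize a small convex quadratic 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = spectral_hs_cg(f, grad, np.zeros(2))
```

For this quadratic the minimizer solves A x = b, which the iteration recovers to the requested tolerance; the steepest-descent restart is one common way to enforce the sufficient descent property highlighted in the record's keywords.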
Cites Work
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- A new descent spectral Polak-Ribière-Polyak method based on the memoryless BFGS update
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- An online conjugate gradient algorithm for large-scale data analysis in machine learning
- Nonlinear conjugate gradient methods for unconstrained optimization
- Some modified Hestenes-Stiefel conjugate gradient algorithms with application in image restoration
- Superlinear convergence of nonlinear conjugate gradient method and scaled memoryless BFGS method based on assumptions about the initial point
- A conjugate gradient algorithm and its applications in image restoration
- Optimization theory and methods. Nonlinear programming
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- Algorithm 851
- Two-Point Step Size Gradient Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Self-Scaling Variable Metric (SSVM) Algorithms
- Optimal conditioning of self-scaling variable Metric algorithms
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A modified Hestenes–Stiefel conjugate gradient method with an optimal property
- A Two-Term PRP-Based Descent Method
- CUTEr and SifDec
- Methods of conjugate gradients for solving linear systems
- A spectral conjugate gradient method for unconstrained optimization
- Benchmarking optimization software with performance profiles.
- Preconditioned nonlinear conjugate gradient method for micromagnetic energy minimization