A family of limited memory three term conjugate gradient methods
DOI: 10.1080/10556788.2024.2329591
MaRDI QID: Q6661110
Publication date: 10 January 2025
Published in: Optimization Methods & Software
Cites Work
- Title not available
- Title not available
- Title not available
- Title not available
- Title not available
- Title not available
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- A limited memory descent Perry conjugate gradient method
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- A sufficient descent LS conjugate gradient method for unconstrained optimization problems
- Convergence of Liu-Storey conjugate gradient method
- On the limited memory BFGS method for large scale optimization
- A new Liu-Storey type nonlinear conjugate gradient method for unconstrained optimization problems
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- Modern numerical nonlinear optimization
- Nonlinear conjugate gradient methods for unconstrained optimization
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- New quasi-Newton methods for unconstrained optimization problems
- Reduced-Hessian quasi-Newton methods for unconstrained optimization
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence
- Algorithm 851
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Two-Point Step Size Gradient Methods
- Updating Quasi-Newton Matrices with Limited Storage
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Conjugate Gradient Methods with Inexact Searches
- On the Convergence of a New Conjugate Gradient Algorithm
- CUTE
- Limited-Memory Reduced-Hessian Methods for Large-Scale Unconstrained Optimization
- A scaled conjugate gradient method for nonlinear unconstrained optimization
- A modified Hestenes-Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- Some descent three-term conjugate gradient methods and their global convergence
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles
- A new efficient conjugate gradient method for unconstrained optimization