Pages that link to "Item:Q970585"
From MaRDI portal
The following pages link to Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization (Q970585):
Displaying 22 items.
- New hybrid conjugate gradient method for unconstrained optimization (Q278565)
- Two modified scaled nonlinear conjugate gradient methods (Q390466)
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates (Q645035)
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique (Q723782)
- A descent hybrid conjugate gradient method based on the memoryless BFGS update (Q1625764)
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization (Q1751056)
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods (Q1937015)
- New hybrid conjugate gradient method as a convex combination of LS and FR methods (Q2150719)
- A class of accelerated conjugate-gradient-like methods based on a modified secant equation (Q2190281)
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization (Q2267641)
- A new hybrid conjugate gradient method for large-scale unconstrained optimization problems with non-convex objective functions (Q2327437)
- Comments on ``Another hybrid conjugate gradient algorithm for unconstrained optimization'' by Andrei (Q2351487)
- Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization (Q2390003)
- Several efficient gradient methods with approximate optimal stepsizes for large-scale unconstrained optimization (Q2406313)
- A hybrid Riemannian conjugate gradient method for nonconvex optimization problems (Q2700130)
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter (Q2868907)
- A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach (Q3458811)
- A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization (Q4959904)
- An efficient hybrid conjugate gradient method for unconstrained optimization (Q5058378)
- An efficient hybrid conjugate gradient method with sufficient descent property for unconstrained optimization (Q5058391)
- Comments on ``A hybrid conjugate gradient method based on a quadratic relaxation of the Dai-Yuan hybrid conjugate gradient parameter'' (Q5248230)
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation (Q5495572)