CONVERGENCE PROPERTY AND MODIFICATIONS OF A MEMORY GRADIENT METHOD
Publication: 5716132
DOI: 10.1142/S0217595905000625
zbMath: 1158.90407
MaRDI QID: Q5716132
Publication date: 9 January 2006
Published in: Asia-Pacific Journal of Operational Research
Cites Work
- A gradient-related algorithm with inexact line searches
- Global convergence of a two-parameter family of conjugate gradient methods without line search
- A truncated Newton method with non-monotone line search for unconstrained optimization
- R-linear convergence of the Barzilai and Borwein gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Testing Unconstrained Optimization Software
- A Nonmonotone Line Search Technique for Newton’s Method
- On the Barzilai and Borwein choice of steplength for the gradient method
- An efficient hybrid conjugate gradient method for unconstrained optimization
- Global convergence of conjugate gradient methods without line search
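Several of the cited works concern the Barzilai and Borwein choice of steplength for the gradient method. As background only (this is not the paper's own memory gradient scheme), a minimal sketch of gradient descent with the standard BB1 step size, where the step is computed from the iterate difference `s` and gradient difference `y`:

```python
import numpy as np

def barzilai_borwein(grad, x0, alpha0=1e-3, tol=1e-8, max_iter=500):
    """Gradient descent with the BB1 step alpha_k = s^T s / (s^T y).

    A background sketch of the Barzilai-Borwein method; `grad` maps a
    point to the gradient of the objective at that point.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    x_new = x - alpha0 * g          # plain gradient step to initialize
    for _ in range(max_iter):
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            break
        s = x_new - x               # difference of iterates
        y = g_new - g               # difference of gradients
        alpha = s.dot(s) / s.dot(y)  # BB1 step size
        x, g = x_new, g_new
        x_new = x - alpha * g       # no line search is performed
    return x_new

# Example: minimize the convex quadratic 0.5 * x^T A x - b^T x,
# whose gradient is A x - b (A and b are illustrative choices).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = barzilai_borwein(lambda x: A @ x - b, x0=np.zeros(2))
```

Note that the iteration is nonmonotone in the objective value, which is why the cited literature pairs such steps with nonmonotone line search techniques; for strictly convex quadratics the method is R-linearly convergent, per the cited work of that title.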