Memory gradient method with Goldstein line search
Publication: 2469911
DOI: 10.1016/j.camwa.2007.02.001 · zbMath: 1175.90416 · OpenAlex: W2029306216 · MaRDI QID: Q2469911
Publication date: 11 February 2008
Published in: Computers & Mathematics with Applications
Full work available at URL: https://doi.org/10.1016/j.camwa.2007.02.001
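The title combines two standard ingredients: a memory gradient direction d_k = -g_k + beta_k * d_{k-1}, which reuses the previous search direction, and the Goldstein conditions, which bracket the step length between a sufficient-decrease bound and a not-too-small bound. The Python sketch below illustrates these generic ingredients only; the bisection routine, the damping rule for beta_k, and all parameter values are illustrative assumptions, not the specific update or convergence analysis proposed in this paper.

```python
import numpy as np

def goldstein_step(f, g, x, d, c=0.25, alpha=1.0, max_iter=50):
    """Find a step length satisfying the textbook Goldstein conditions
    (0 < c < 1/2, d a descent direction):
        f(x) + (1-c)*a*s <= f(x + a*d) <= f(x) + c*a*s,  s = g(x)@d < 0,
    by simple expansion/bisection. Parameters here are illustrative."""
    fx, slope = f(x), g(x) @ d
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        fnew = f(x + alpha * d)
        if fnew > fx + c * alpha * slope:           # too little decrease: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif fnew < fx + (1 - c) * alpha * slope:   # step too short: enlarge
            lo = alpha
            alpha = 2.0 * alpha if np.isinf(hi) else 0.5 * (lo + hi)
        else:                                       # both conditions hold
            break
    return alpha

def memory_gradient(f, g, x0, eta=0.5, tol=1e-8, iters=500):
    """Generic memory gradient iteration d_k = -g_k + beta_k * d_{k-1}.
    The damping rule for beta_k below only guarantees descent
    (g_k @ d_k <= -(1-eta)*||g_k||^2); it is NOT the paper's update."""
    x = np.asarray(x0, dtype=float)
    d_prev = None
    for _ in range(iters):
        gk = g(x)
        gg = gk @ gk
        if np.sqrt(gg) < tol:
            break
        if d_prev is None:
            d = -gk                                    # first step: steepest descent
        else:
            beta = eta * gg / (gg + abs(gk @ d_prev))  # keeps the memory term small
            d = -gk + beta * d_prev
        alpha = goldstein_step(f, g, x, d)
        x, d_prev = x + alpha * d, d
    return x

# Usage: minimize the Rosenbrock function from the classic starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
g = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                        200 * (x[1] - x[0]**2)])
print(memory_gradient(f, g, [-1.2, 1.0]))  # moves toward the minimizer [1, 1]
```

The damping of beta_k merely keeps d_k a descent direction so that the Goldstein search is well defined; the cited works below study sharper choices of the memory term and their global convergence.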
Cites Work
- The use of alternation and recurrences in two-step quasi-Newton methods
- Three-step fixed-point quasi-Newton methods for unconstrained optimisation
- Efficient generalized conjugate gradient algorithms. I: Theory
- Stepsize analysis for descent methods
- Global convergence result for conjugate gradient methods
- Minimum curvature multistep quasi-Newton methods
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Nonmonotone globalization techniques for the Barzilai-Borwein gradient method
- A limited-memory multipoint symmetric secant method for bound constrained optimization
- Improved Hessian approximations for the limited memory BFGS method
- A class of nonmonotone stabilization methods in unconstrained optimization
- Memory gradient method for the minimization of functions
- Relation between the memory gradient method and the Fletcher-Reeves method
- Study on a supermemory gradient method for the minimization of functions
- Quadratically convergent algorithms and one-dimensional search schemes
- A new super-memory gradient method with curve search rule
- A three-parameter family of nonlinear conjugate gradient methods
- R-linear convergence of the Barzilai and Borwein gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Convergence of multi-step curve search method for unconstrained optimization
- Convergence properties of the Fletcher-Reeves method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- On the Barzilai and Borwein choice of steplength for the gradient method
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- On Steepest Descent
- Conjugate Directions without Linear Searches
- Convergence property and modifications of a memory gradient method
- Implicit updates in multistep quasi-Newton methods
- A nonlinear model for function-value multistep methods
- Adaptive two-point stepsize gradient algorithm
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method