Global convergence of a memory gradient method without line search
Publication: 949285
DOI: 10.1007/s12190-007-0021-4
zbMath: 1193.90214
OpenAlex: W2069863795
MaRDI QID: Q949285
Publication date: 21 October 2008
Published in: Journal of Applied Mathematics and Computing
Full work available at URL: https://doi.org/10.1007/s12190-007-0021-4
Related Items (3)
- An ODE-based trust region method for unconstrained optimization problems
- A new supermemory gradient method for unconstrained optimization problems
- The new spectral conjugate gradient method for large-scale unconstrained optimisation
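For context, a memory gradient method builds each search direction from the current negative gradient plus a combination of recent directions, and a "without line search" scheme fixes the step size by a closed-form formula instead of a one-dimensional search. The sketch below is a generic, illustrative variant under assumed parameter choices (the memory weight `beta`, step scaling `delta`, and the sufficient-descent safeguard are all assumptions here, not the specific rules analyzed in the paper):

```python
import numpy as np

def memory_gradient(grad, x0, m=3, beta=0.2, delta=0.5, tol=1e-8, max_iter=5000):
    """Illustrative memory gradient iteration without line search.

    Direction: d_k = -g_k + (beta / |memory|) * sum of the last m directions.
    Step size: alpha_k = -delta * (g_k . d_k) / ||d_k||^2  (a fixed formula,
    no one-dimensional search). Both choices are assumptions for this sketch.
    """
    x = np.asarray(x0, dtype=float)
    past = []  # memory of up to m previous search directions
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g
        if past:
            d = d + (beta / len(past)) * sum(past)
        # Sufficient-descent safeguard: fall back to steepest descent
        # if the memory terms spoil the descent property.
        if g @ d > -0.1 * (g @ g):
            d = -g
        alpha = -delta * (g @ d) / (d @ d)  # closed-form step, no line search
        x = x + alpha * d
        past.append(d)
        if len(past) > m:
            past.pop(0)
    return x

# Usage: minimize the strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = memory_gradient(lambda x: A @ x - b, np.zeros(2))
```

On this quadratic the computed `x_star` agrees with `np.linalg.solve(A, b)`; the memory terms typically reduce zigzagging relative to plain steepest descent with the same fixed-step formula.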
Cites Work
- Global convergence of a two-parameter family of conjugate gradient methods without line search
- Global convergence of a memory gradient method for unconstrained optimization
- Memory gradient method for the minimization of functions
- Study on a supermemory gradient method for the minimization of functions
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A new class of memory gradient methods with inexact line searches
- On a profit maximizing location model