Global convergence of conjugate gradient method
From MaRDI portal
Publication: 3622012
DOI: 10.1080/02331930802434633 · zbMath: 1158.90406 · OpenAlex: W2254446106 · MaRDI QID: Q3622012
Publication date: 23 April 2009
Published in: Optimization
Full work available at URL: https://doi.org/10.1080/02331930802434633
Numerical mathematical programming methods (65K05) Nonlinear programming (90C30) Numerical methods based on nonlinear programming (49M37)
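The record carries no abstract. For context on the subject classification above, here is a minimal sketch of a nonlinear conjugate gradient iteration, using the Fletcher–Reeves coefficient with an Armijo backtracking line search. This is a standard textbook form of the method, not necessarily the scheme analyzed in the publication itself.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG (Fletcher-Reeves) with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                 # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:        # stop when the gradient is small
            break
        # Armijo backtracking: halve the step until sufficient decrease holds
        t, fx, gd = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * gd:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if g_new @ d >= 0:                 # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = x'Ax/2 - b'x,
# whose unique minimizer solves the linear system Ax = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = fletcher_reeves_cg(f, grad, np.zeros(2))
```

On a strictly convex quadratic this nonlinear iteration reduces to the classical linear conjugate gradient method of Hestenes and Stiefel (cited below), so the computed `x_star` satisfies `A @ x_star ≈ b`.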
Cites Work
- New inexact line search method for unconstrained optimization
- A truncated Newton method with non-monotone line search for unconstrained optimization
- Analysis of monotone gradient methods
- Minimization of functions having Lipschitz continuous first partial derivatives
- Convergence of descent method without line search
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- On the Convergence of a New Conjugate Gradient Algorithm
- Numerical Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- An Assessment of Nonmonotone Linesearch Techniques for Unconstrained Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- On the nonmonotone line search