On the sufficient descent property of the Shanno's conjugate gradient method
Publication: 1947631
DOI: 10.1007/s11590-012-0462-z
zbMath: 1269.90106
OpenAlex: W1985539008
MaRDI QID: Q1947631
Publication date: 23 April 2013
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-012-0462-z
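For context, the sufficient descent property named in the title is the standard condition required in global convergence analyses of nonlinear conjugate gradient methods. The sketch below uses the usual notation g_k = ∇f(x_k), s_k = x_{k+1} − x_k, y_k = g_{k+1} − g_k; this notation is an assumption on my part and is not taken from the portal entry itself.

% Sufficient descent: the search directions d_k satisfy, for some constant c > 0,
\[
  g_k^{\top} d_k \le -c \, \lVert g_k \rVert^2 , \qquad c > 0, \quad \text{for all } k \ge 0 .
\]
% Shanno's conjugate gradient method is commonly described as a memoryless BFGS scheme:
% d_{k+1} = -H_{k+1} g_{k+1}, where H_{k+1} is the BFGS update of the identity matrix
% built from the most recent pair (s_k, y_k). Written out (a standard form, stated here
% as an assumption rather than quoted from the paper):
\[
  d_{k+1} = -g_{k+1}
            + \frac{g_{k+1}^{\top} s_k}{s_k^{\top} y_k}\, y_k
            - \left[ \left( 1 + \frac{y_k^{\top} y_k}{s_k^{\top} y_k} \right)
                     \frac{g_{k+1}^{\top} s_k}{s_k^{\top} y_k}
                     - \frac{g_{k+1}^{\top} y_k}{s_k^{\top} y_k} \right] s_k .
\]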
Related Items
- A restart scheme for the memoryless BFGS method
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
Cites Work
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Scaled conjugate gradient algorithms for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- Technical Note—A Modified Conjugate Gradient Algorithm
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Conjugate Gradient Methods with Inexact Searches
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Convergence Conditions for Ascent Methods
- A Family of Variable-Metric Methods Derived by Variational Means
- The Convergence of a Class of Double-rank Minimization Algorithms
- A new approach to variable metric algorithms
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Conditioning of Quasi-Newton Methods for Function Minimization
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A spectral conjugate gradient method for unconstrained optimization
- Mathematical theory of optimization