An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
DOI: 10.1016/j.cam.2017.10.013 · zbMath: 1382.90116 · OpenAlex: W2766109040 · MaRDI QID: Q1677473
Publication date: 21 November 2017
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2017.10.013
Keywords: unconstrained optimization; global convergence; conjugate gradient method; self-scaling memoryless BFGS matrix
MSC: Numerical computation of eigenvalues and eigenvectors of matrices (65F15); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53); Numerical methods based on nonlinear programming (49M37)
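The keywords name the self-scaling memoryless BFGS family of three-term conjugate gradient directions. As a generic illustration only (this is the classical Perry–Shanno self-scaling memoryless BFGS direction, not the adaptive variant proposed in the paper, and the Armijo backtracking line search below is a simple stand-in for the Wolfe search the convergence theory typically requires), a minimal sketch might look like:

```python
import numpy as np

def rosenbrock(x):
    """Standard 2-D Rosenbrock test function."""
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def ssml_bfgs_direction(g, s, y):
    """Three-term direction d = -H g, where H is the BFGS update of tau*I
    (memoryless: built from the identity each iteration) and
    tau = s'y / y'y is the Oren-Luenberger self-scaling factor."""
    sy = s @ y
    if sy <= 1e-12:            # curvature condition fails: fall back to steepest descent
        return -g
    tau = sy / (y @ y)
    rho = 1.0 / sy
    # H g expanded term by term so the matrix H is never formed:
    Hg = (tau * g
          - tau * rho * ((y @ g) * s + (s @ g) * y)
          + (tau * rho**2 * (y @ y) + rho) * (s @ g) * s)
    return -Hg

def minimize(f, grad, x0, tol=1e-6, max_iter=500):
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (illustrative; not the paper's line search)
        t, gTd = 1.0, g @ d
        while f(x + t * d) > f(x) + 1e-4 * t * gTd and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        d = ssml_bfgs_direction(g_new, x_new - x, g_new - g)
        x, g = x_new, g_new
    return x

x_star = minimize(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
print(x_star)  # for Rosenbrock the exact minimizer is [1, 1]
```

Because H is a positive definite update of the scaled identity whenever the curvature condition s'y > 0 holds, the resulting three-term direction is a descent direction, which is what the global convergence analyses in the cited works build on.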
Related Items (10)
Uses Software
Cites Work
- On three-term conjugate gradient algorithms for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- A new three-term conjugate gradient algorithm for unconstrained optimization
- An optimal parameter for Dai–Liao family of conjugate gradient methods
- Subgradient methods for huge-scale optimization problems
- An improved nonlinear conjugate gradient method with an optimal property
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
- Adaptive restart for accelerated gradient schemes
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Two optimal Dai–Liao conjugate gradient methods
- Technical Note—A Modified Conjugate Gradient Algorithm
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- On the Convergence of a New Conjugate Gradient Algorithm
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Some methods of speeding up the convergence of iteration methods
- Convergence Conditions for Ascent Methods
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Convergence Conditions for Ascent Methods. II: Some Corrections
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- The conjugate gradient method in extremal problems
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles.