A new family of conjugate gradient methods for unconstrained optimization
Publication: 1786950
DOI: 10.1007/s12190-017-1141-0 · zbMath: 1401.90223 · OpenAlex: W2765314055 · MaRDI QID: Q1786950
Publication date: 25 September 2018
Published in: Journal of Applied Mathematics and Computing
Full work available at URL: https://doi.org/10.1007/s12190-017-1141-0
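This page records only bibliographic metadata and does not describe the new family of methods itself. As a general illustration of the class of algorithms the title refers to, the sketch below implements a standard nonlinear conjugate gradient iteration with a Polak-Ribière-Polyak (PRP+) update and a backtracking Armijo line search. It is not the method proposed in the paper; the function names, parameters, and safeguards are illustrative assumptions.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimize f with a PRP+ nonlinear conjugate gradient iteration (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g.dot(d) >= 0:  # safeguard: restart if d is not a descent direction
            d = -g
        # Backtracking (Armijo) line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere-Polyak conjugacy parameter, clipped at zero (PRP+)
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example use on the Rosenbrock function; the iterates should approach the minimizer (1, 1)
if __name__ == "__main__":
    f = lambda x: (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])
    print(nonlinear_cg(f, grad, [-1.2, 1.0]))
```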
Related Items (6)
- A globally convergent projection method for a system of nonlinear monotone equations
- Signal recovery with convex constrained nonlinear monotone equations through conjugate gradient hybrid approach
- An inertial spectral CG projection method based on the memoryless BFGS update
- Unnamed Item
- A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
- Two modified conjugate gradient methods for unconstrained optimization with applications in image restoration problems
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems
- A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems
- An adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- Conjugate gradient algorithms in nonconvex optimization
- Perspectives on self-scaling variable metric algorithms
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- Optimization theory and methods. Nonlinear programming
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- Two-Point Step Size Gradient Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Self-Scaling Variable Metric (SSVM) Algorithms
- On the Convergence of a New Conjugate Gradient Algorithm
- Numerical Optimization
- CUTE
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles
- A new efficient conjugate gradient method for unconstrained optimization