Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods
Publication:523183
DOI: 10.1007/s11590-016-1060-2 · zbMath: 1367.90091 · OpenAlex: W2472474713 · MaRDI QID: Q523183
Andrea Caliciotti, Giovanni Fasano, Massimo Roma
Publication date: 20 April 2017
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-016-1060-2
Keywords: quasi-Newton updates; approximate inverse preconditioners; large scale nonconvex optimization; preconditioned nonlinear conjugate gradient
MSC classification: Large-scale problems in mathematical programming (90C06); Nonconvex programming, global optimization (90C26)
Related Items (8)
New three-term conjugate gradient algorithm for solving monotone nonlinear equations and signal recovery problems ⋮ A link between the steepest descent method and fixed-point iterations ⋮ Preconditioned nonlinear conjugate gradient methods based on a modified secant equation ⋮ Preconditioned nonlinear conjugate gradient method for micromagnetic energy minimization ⋮ Regulation cooperative control for heterogeneous uncertain chaotic systems with time delay: a synchronization errors estimation framework ⋮ Exploiting damped techniques for nonlinear conjugate gradient methods ⋮ On a conjugate directions method for solving strictly convex QP problem ⋮ A variation of Broyden class methods using Householder adaptive transforms
Uses Software
Cites Work
- Preconditioning Newton-Krylov methods in nonconvex large scale optimization
- On the limited memory BFGS method for large scale optimization
- Conjugate gradient algorithms in nonconvex optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- A novel class of approximate inverse preconditioners for large positive definite linear systems in optimization
- Iterative computation of negative curvature directions in large scale optimization
- On A Class of Limited Memory Preconditioners For Large Scale Linear Systems With Multiple Right-Hand Sides
- Sizing and Least-Change Secant Methods
- QN-like variable storage conjugate gradients
- A Relationship between the BFGS and Conjugate Gradient Algorithms and Its Implications for New Algorithms
- Updating Quasi-Newton Matrices with Limited Storage
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Conjugate Gradient Methods with Inexact Searches
- Line search algorithms with guaranteed sufficient decrease
- Automatic Preconditioning by Limited Memory Quasi-Newton Updating
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles