Exploiting damped techniques for nonlinear conjugate gradient methods
DOI: 10.1007/s00186-017-0593-1
zbMath: 1390.90388
OpenAlex: W2617669545
MaRDI QID: Q684134
Andrea Caliciotti, Massimo Roma, Mehiddin Al-Baali, Giovanni Fasano
Publication date: 9 February 2018
Published in: Mathematical Methods of Operations Research
Full work available at URL: http://hdl.handle.net/10278/3694943
Keywords: nonlinear conjugate gradient methods; quasi-Newton updates; large scale unconstrained optimization; damped techniques
Related Items
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A Class of Approximate Inverse Preconditioners Based on Krylov-Subspace Methods for Large-Scale Nonconvex Optimization
- A new conjugate gradient method based on quasi-Newton equation for unconstrained optimization
- A sufficient descent nonlinear conjugate gradient method for solving \(\mathcal{M}\)-tensor equations
- Real-time pricing method for smart grid based on social welfare maximization model
Uses Software
- CUTEst
Cites Work
- Preconditioning Newton-Krylov methods in nonconvex large scale optimization
- Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods
- Conjugate gradient algorithms in nonconvex optimization
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Damped techniques for the limited memory BFGS method for large-scale optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- A novel class of approximate inverse preconditioners for large positive definite linear systems in optimization
- Damped techniques for enforcing convergence of quasi-Newton methods
- Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
- On A Class of Limited Memory Preconditioners For Large Scale Linear Systems With Multiple Right-Hand Sides
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- How bad are the BFGS and DFP methods when the objective function is quadratic?
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Algorithms for nonlinear constraints that use Lagrangian functions
- Line search algorithms with guaranteed sufficient decrease
- Automatic Preconditioning by Limited Memory Quasi-Newton Updating
- On the Order of Convergence of Preconditioned Nonlinear Conjugate Gradient Methods
- Convergence conditions, line search algorithms and trust region implementations for the Polak–Ribière conjugate gradient method
- The Limited Memory Conjugate Gradient Method
- Benchmarking optimization software with performance profiles