A descent hybrid conjugate gradient method based on the memoryless BFGS update
From MaRDI portal
Publication: 1625764
DOI: 10.1007/s11075-018-0479-1
OpenAlex: W2790222781
MaRDI QID: Q1625764
Vassilis Tampakas, Ioannis E. Livieris, Panagiotis Pintelas
Publication date: 29 November 2018
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-018-0479-1
Keywords: unconstrained optimization, global convergence, conjugate gradient method, Frobenius norm, self-scaled memoryless BFGS
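For context on the keywords: in a memoryless BFGS scheme, the inverse-Hessian approximation is rebuilt at every iteration from the identity matrix and the most recent step/gradient-difference pair, so no matrix is stored. Below is a minimal sketch of the generic memoryless BFGS search direction; this is the standard textbook construction, not the specific hybrid rule proposed in the paper, and the function and variable names are illustrative.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Search direction d = -H g, where H is the BFGS update of the
    identity matrix using only the latest pair (s, y).

    g : gradient at the new iterate
    s : x_{k+1} - x_k (step)
    y : g_{k+1} - g_k (gradient difference)
    """
    sy = s @ y  # curvature s^T y; positive under a Wolfe line search
    sg, yg = s @ g, y @ g
    # Expanding d = -(I - (s y^T + y s^T)/sy + (1 + y^T y/sy) s s^T/sy) g:
    return (-g
            + (yg / sy) * s
            + (sg / sy) * y
            - (1.0 + (y @ y) / sy) * (sg / sy) * s)
```

When s^T y > 0 the implicit matrix H is positive definite, so g^T d = -g^T H g < 0 and d is a descent direction, which is the property the hybrid methods in this literature aim to guarantee.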
Related Items (11)
- An efficient hybrid conjugate gradient method with sufficient descent property for unconstrained optimization
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- A hybrid quasi-Newton method with application in sparse recovery
- A modified conjugate gradient parameter via hybridization approach for solving large-scale systems of nonlinear equations
- A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
- A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems
- Unnamed Item
- Unnamed Item
- A globally convergent gradient-like method based on the Armijo line search
- Improved conjugate gradient method for nonlinear system of equations
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- A limited memory descent Perry conjugate gradient method
- A class of adaptive Dai-Liao conjugate gradient methods based on the scaled memoryless BFGS update
- Conjugate gradient method for rank deficient saddle point problems
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- Two minimal positive bases based direct search conjugate gradient methods for computationally expensive functions
- Analysis of a self-scaling quasi-Newton method
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- On the limited memory BFGS method for large scale optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- A wavelet-based nested iteration-inexact conjugate gradient algorithm for adaptively solving elliptic PDEs
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Hybrid conjugate gradient algorithm for unconstrained optimization
- Efficient hybrid conjugate gradient techniques
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- The conjugate gradient method for linear ill-posed problems with operator perturbations
- Global convergence result for conjugate gradient methods
- Global and superlinear convergence of a restricted class of self-scaling methods with inexact line searches, for convex functions
- Iterative accelerating algorithms with Krylov subspaces for the solution to large-scale nonlinear problems
- A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- Two descent hybrid conjugate gradient methods for optimization
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
- Numerical Experience with Limited-Memory Quasi-Newton and Truncated Newton Methods
- Newton-Type Minimization via the Lanczos Method
- Two optimal Dai–Liao conjugate gradient methods
- A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach
- Comparison of advanced large-scale minimization algorithms for the solution of inverse ill-posed problems
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Preconditioning of Truncated-Newton Methods
- Updating Quasi-Newton Matrices with Limited Storage
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Self-Scaling Variable Metric (SSVM) Algorithms
- Optimal conditioning of self-scaling variable Metric algorithms
- Restart procedures for the conjugate gradient method
- On the Convergence of a New Conjugate Gradient Algorithm
- Numerical Optimization
- CUTE
- Extra updates for the BFGS method
- Two modified hybrid conjugate gradient methods based on a hybrid secant equation
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles
- An efficient hybrid conjugate gradient method for unconstrained optimization
- Enriched methods for large-scale unconstrained optimization