A hybrid scaling parameter for the scaled memoryless BFGS method based on the ℓ∞ matrix norm
From MaRDI portal
Publication:5031718
DOI: 10.1080/00207160.2018.1465940 · zbMath: 1499.90214 · OpenAlex: W2802047615 · MaRDI QID: Q5031718
Publication date: 16 February 2022
Published in: International Journal of Computer Mathematics
Full work available at URL: https://doi.org/10.1080/00207160.2018.1465940
unconstrained optimization · condition number · line search · descent property · scaled memoryless BFGS update · \(\ell_\infty\) norm
Numerical mathematical programming methods (65K05) Convex programming (90C25) Nonlinear programming (90C30) Methods of quasi-Newton type (90C53)
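The keywords above center on the scaled (self-scaling) memoryless BFGS update. As context, here is a minimal sketch of the generic self-scaling memoryless BFGS search direction, obtained by applying the inverse BFGS update to \(H_k = \theta_k I\); the scalar `theta` is left as an input, since the paper's hybrid \(\ell_\infty\)-based choice of the scaling parameter is not reproduced here:

```python
import numpy as np

def ssml_bfgs_direction(g_next, s, y, theta):
    """Search direction d = -H_{k+1} g_{k+1} for the self-scaling
    memoryless BFGS update of H_k = theta * I, expanded as
      H_{k+1} = theta*I - theta*rho*(s y^T + y s^T)
                + (theta*rho^2*(y^T y) + rho) * s s^T,
    with rho = 1/(s^T y).  `theta` is a generic positive scaling
    parameter (placeholder for the paper's hybrid choice)."""
    rho = 1.0 / (s @ y)
    Hg = (theta * g_next
          - theta * rho * (s * (y @ g_next) + y * (s @ g_next))
          + (theta * rho**2 * (y @ y) + rho) * s * (s @ g_next))
    return -Hg
```

For any `theta`, the implied matrix satisfies the secant equation \(H_{k+1} y_k = s_k\), and for `theta > 0` with curvature \(s_k^\top y_k > 0\) the direction is a descent direction (\(g_{k+1}^\top d_{k+1} < 0\)), which is the descent property listed among the keywords.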
Related Items (5)
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
Uses Software
Cites Work
- A modified scaling parameter for the memoryless BFGS updating formula
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- Convergence analysis of a modified BFGS method on convex minimizations
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- Self-Scaling Variable Metric (SSVM) Algorithms
- Self-Scaling Variable Metric (SSVM) Algorithms
- Optimal conditioning of self-scaling variable Metric algorithms
- CUTEr and SifDec
- Benchmarking optimization software with performance profiles.