Diagonal BFGS updates and applications to the limited memory BFGS method
Publication: 2114834
DOI: 10.1007/s10589-022-00353-3
zbMath: 1487.90607
OpenAlex: W4213184394
MaRDI QID: Q2114834
Xiaozhou Wang, Jiajian Huang, Dong-hui Li
Publication date: 15 March 2022
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-022-00353-3
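The paper concerns seeding the limited memory BFGS method with a diagonal, rather than scalar, initial Hessian approximation. As background, the following is a minimal sketch of the standard L-BFGS two-loop recursion (Updating Quasi-Newton Matrices with Limited Storage, cited below) with a diagonal initial inverse-Hessian seed `d0`; the function and variable names are illustrative, and this is the classical recursion, not the paper's specific diagonal update formula.

```python
import numpy as np

def lbfgs_two_loop(grad, s_list, y_list, d0):
    """Standard L-BFGS two-loop recursion.

    Computes H_k @ grad, where H_k is the implicit L-BFGS inverse
    Hessian approximation built from the stored curvature pairs
    (s_i, y_i), seeded with a diagonal initial approximation d0
    (a vector holding the diagonal of H_k^0).  Replacing the usual
    scalar gamma * I seed with a richer diagonal is the setting the
    paper studies; the recursion itself is unchanged.
    """
    q = grad.copy()
    rhos = [1.0 / float(y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Apply the diagonal initial inverse-Hessian approximation.
    r = d0 * q
    # Second loop: oldest pair to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r += (a - b) * s
    return r  # approximates H_k @ grad; the search direction is -r
```

With an empty memory the recursion reduces to `d0 * grad`, and with any stored pairs it satisfies the secant condition `H_k y_{k-1} = s_{k-1}` regardless of the diagonal seed.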
Related Items (2)
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- Proximal quasi-Newton method for composite optimization over the Stiefel manifold
Uses Software
Cites Work
- Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization
- Some numerical experiments with variable-storage quasi-Newton algorithms
- On the limited memory BFGS method for large scale optimization
- Modified two-point stepsize gradient methods for unconstrained optimization
- Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
- Improved Hessian approximations for the limited memory BFGS method
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- On efficiently combining limited-memory and trust-region techniques
- A diagonal quasi-Newton updating method for unconstrained optimization
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- R-linear convergence of the Barzilai and Borwein gradient method
- Diagonal quasi-Newton method via variational principle under generalized Frobenius norm
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Sizing and Least-Change Secant Methods
- Two-Point Step Size Gradient Methods
- Updating Quasi-Newton Matrices with Limited Storage
- Quasi-Newton Methods, Motivation and Theory
- Gradient Method with Retards and Generalizations
- Numerical Optimization
- Alternate step gradient method*
- A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- The Quasi-Cauchy Relation and Diagonal Updating
- On the Barzilai and Borwein choice of steplength for the gradient method
- A New Diagonal Quasi-Newton Updating Method With Scaled Forward Finite Differences Directional Derivative for Unconstrained Optimization
- The cyclic Barzilai-Borwein method for unconstrained optimization
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles