The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
From MaRDI portal
Publication:2163462
DOI: 10.1007/s11075-022-01265-3
zbMath: 1496.65079
OpenAlex: W4226531457
MaRDI QID: Q2163462
Junyu Lu, Pengyuan Li, Gong Lin Yuan
Publication date: 10 August 2022
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-022-01265-3
Numerical mathematical programming methods (65K05)
Nonconvex programming, global optimization (90C26)
Methods of quasi-Newton type (90C53)
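As context for the classifications above, a minimal sketch of the classical BFGS method paired with a standard weak Wolfe-Powell (WWP) line search might look like the following. This is a generic illustration, not the modified WWP search or the convergence mechanism proposed in the publication itself; the function names and tolerances are illustrative assumptions.

```python
import numpy as np

def weak_wolfe_step(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection/expansion search for a step length satisfying the standard
    weak Wolfe-Powell conditions (sufficient decrease + curvature)."""
    lo, hi = 0.0, np.inf
    t = 1.0
    fx, gd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * gd:     # sufficient-decrease condition fails
            hi = t
        elif grad(x + t * d) @ d < c2 * gd:     # curvature condition fails
            lo = t
        else:
            return t
        t = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return t

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """Plain BFGS iteration maintaining the inverse-Hessian approximation H."""
    n = x0.size
    H = np.eye(n)
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                               # quasi-Newton search direction
        t = weak_wolfe_step(f, grad, x, d)
        s = t * d
        x_new = x + s
        y = grad(x_new) - g
        sy = s @ y
        if sy > 1e-12:                           # skip update if curvature s'y > 0 fails
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)  # BFGS inverse update
        x = x_new
    return x
```

For convex quadratics the WWP conditions are satisfied quickly and the iteration converges in a few steps; the cited literature studies what additional safeguards (cautious updates, modified line searches) are needed to retain global convergence on nonconvex functions.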
Related Items (2)
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- Convergence analysis of an improved BFGS method and its application in the Muskingum model
Uses Software
Cites Work
- New cautious BFGS algorithm based on modified Armijo-type line search
- Convergence analysis of a modified BFGS method on convex minimizations
- The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
- A class of parameter estimation methods for nonlinear Muskingum model using hybrid invasive weed optimization algorithm
- A double parameter scaled BFGS method for unconstrained optimization
- A new restarting adaptive trust-region method for unconstrained optimization
- A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models
- Local convergence analysis for partitioned quasi-Newton updates
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- The global convergence of a modified BFGS method for nonconvex functions
- New line search methods for unconstrained optimization
- Global convergence of the partitioned BFGS algorithm for convex partially separable optimization
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Testing Unconstrained Optimization Software
- CUTE
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Convergence Properties of the BFGS Algorithm
- One-step and multistep procedures for constrained minimization problems
- New BFGS method for unconstrained optimization problem based on modified Armijo line search
- A Family of Variable-Metric Methods Derived by Variational Means
- The Convergence of a Class of Double-rank Minimization Algorithms
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization