A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
DOI: 10.1007/s10957-018-1288-3
zbMATH: 1398.49025
OpenAlex: W2801253117
Wikidata: Q129884028 (Scholia: Q129884028)
MaRDI QID: Q1670017
Publication date: 4 September 2018
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-018-1288-3
Keywords: global convergence; nonlinear programming; numerical comparisons; measure function of Byrd and Nocedal; scaling BFGS method
MSC classifications: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37)
Related Items (3)
Uses Software
Cites Work
- A new three-term conjugate gradient algorithm for unconstrained optimization
- A modified scaling parameter for the memoryless BFGS updating formula
- New cautious BFGS algorithm based on modified Armijo-type line search
- An adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- Spectral scaling BFGS method
- An adaptive scaled BFGS method for unconstrained optimization
- Analysis of a self-scaling quasi-Newton method
- Convergence analysis of a modified BFGS method on convex minimizations
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- New quasi-Newton equation and related methods for unconstrained optimization
- Sizing the BFGS and DFP updates: Numerical study
- Modifying the BFGS method
- The BFGS method with exact line searches fails for non-convex objective functions
- A double parameter scaled BFGS method for unconstrained optimization
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- Continuous nonlinear optimization for engineering applications in GAMS technology
- The global convergence of a modified BFGS method for nonconvex functions
- Variable metric algorithms: Necessary and sufficient conditions for identical behaviour of nonquadratic functions
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Reduced-Hessian Quasi-Newton Methods for Unconstrained Optimization
- A Note on Minimization Algorithms which make Use of Non-quadratic Properties of the Objective Function
- A Modified BFGS Algorithm for Unconstrained Optimization
- Quasi-Newton Updates in Abstract Vector Spaces
- How bad are the BFGS and DFP methods when the objective function is quadratic?
- Two-Point Step Size Gradient Methods
- Updating conjugate directions by the BFGS formula
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- On the Behavior of Broyden’s Class of Quasi-Newton Methods
- Self-Scaling Variable Metric (SSVM) Algorithms
- Optimal conditioning of self-scaling variable Metric algorithms
- Quasi-Newton Methods, Motivation and Theory
- Matrix conditioning and nonlinear optimization
- An extended Dai-Liao conjugate gradient method with global convergence for nonconvex functions
- On Sizing and Shifting the BFGS Update within the Sized-Broyden Family of Secant Updates
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Convergence Properties of the BFGS Algorithm
- Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization
- New BFGS method for unconstrained optimization problem based on modified Armijo line search
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- On the Convergence of the Variable Metric Algorithm
- Minimization Algorithms Making Use of Non-quadratic Properties of the Objective Function
- Methods of conjugate gradients for solving linear systems
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles