A double parameter scaled BFGS method for unconstrained optimization
From MaRDI portal
Publication:1677470
DOI: 10.1016/j.cam.2017.10.009
zbMath: 1422.65084
OpenAlex: W2766909038
MaRDI QID: Q1677470
Publication date: 21 November 2017
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2017.10.009
Keywords: unconstrained optimization; global convergence; numerical comparisons; scaled BFGS method; self-correcting quality
Related Items
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- The global convergence of the BFGS method under a modified Yuan-Wei-Lu line search technique
- A modified conjugate gradient method based on the self-scaling memoryless BFGS update
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- A structured quasi-Newton algorithm with nonmonotone search strategy for structured NLS problems and its application in robotic motion control
Uses Software
Cites Work
- Spectral scaling BFGS method
- Analysis of a self-scaling quasi-Newton method
- Convergence analysis of a modified BFGS method on convex minimizations
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- Modifying the BFGS update by a new column scaling technique
- Modifying the BFGS method
- Global and superlinear convergence of a restricted class of self-scaling methods with inexact line searches, for convex functions
- The BFGS method with exact line searches fails for non-convex objective functions
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- Optimization theory and methods. Nonlinear programming
- Variable metric algorithms: Necessary and sufficient conditions for identical behaviour of nonquadratic functions
- Reduced-Hessian Quasi-Newton Methods for Unconstrained Optimization
- Automatic Column Scaling Strategies for Quasi-Newton Methods
- A Note on Minimization Algorithms which make Use of Non-quadratic Properties of the Objective Function
- A Modified BFGS Algorithm for Unconstrained Optimization
- How bad are the BFGS and DFP methods when the objective function is quadratic?
- Two-Point Step Size Gradient Methods
- Updating conjugate directions by the BFGS formula
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- On the Behavior of Broyden’s Class of Quasi-Newton Methods
- Self-Scaling Variable Metric (SSVM) Algorithms
- Quasi-Newton Methods, Motivation and Theory
- Matrix conditioning and nonlinear optimization
- Conjugate Gradient Methods with Inexact Searches
- CUTE
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Convergence Properties of the BFGS Algorithm
- Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization
- Convergence Conditions for Ascent Methods
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Convergence Conditions for Ascent Methods. II: Some Corrections
- On the Convergence of the Variable Metric Algorithm
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- Minimization Algorithms Making Use of Non-quadratic Properties of the Objective Function
- Methods of conjugate gradients for solving linear systems
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles.