A New Diagonal Quasi-Newton Updating Method With Scaled Forward Finite Differences Directional Derivative for Unconstrained Optimization
From MaRDI portal
Publication:5384612
DOI: 10.1080/01630563.2018.1552293
zbMath: 1417.49037
OpenAlex: W2945911381
Wikidata: Q127835652 (Scholia: Q127835652)
MaRDI QID: Q5384612
Publication date: 24 June 2019
Published in: Numerical Functional Analysis and Optimization
Full work available at URL: https://doi.org/10.1080/01630563.2018.1552293
Keywords: unconstrained optimization; global convergence; numerical comparisons; directional derivatives; diagonal quasi-Newton updating
MSC classifications: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Newton-type methods (49M15)
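The title refers to a forward finite-differences directional derivative. As background only (this is not the paper's updating scheme, and the scaling strategy proposed there is not reproduced here), a minimal sketch of the standard forward-difference approximation to a directional derivative, with a hypothetical quadratic test function:

```python
import numpy as np

def forward_diff_directional_derivative(f, x, d, h=1e-8):
    """Approximate the directional derivative of f at x along direction d
    using a forward finite difference: (f(x + h*d) - f(x)) / h."""
    return (f(x + h * d) - f(x)) / h

# Illustrative test function (assumed for this sketch): f(x) = 0.5 * ||x||^2,
# whose exact directional derivative at x along d is the inner product x . d.
f = lambda x: 0.5 * np.dot(x, x)
x = np.array([1.0, 2.0])
d = np.array([0.0, 1.0])
approx = forward_diff_directional_derivative(f, x, d)
# exact value: x . d = 2.0; the forward-difference error is O(h)
```

The O(h) truncation error of the one-sided formula is why such schemes are typically paired with a small step size and, as in methods like the one this entry describes, some form of scaling.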
Related Items
A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization, Diagonal BFGS updates and applications to the limited memory BFGS method
Uses Software
Cites Work
- Unnamed Item
- Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization
- Some numerical experiments with variable-storage quasi-Newton algorithms
- On the limited memory BFGS method for large scale optimization
- Accumulative approach in multistep diagonal gradient-type method for large-scale unconstrained optimization
- Linear and nonlinear programming
- Continuous nonlinear optimization for engineering applications in GAMS technology
- A diagonal quasi-Newton updating method for unconstrained optimization
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- Sizing and Least-Change Secant Methods
- Preconditioning of Truncated-Newton Methods
- Two-Point Step Size Gradient Methods
- Updating Quasi-Newton Matrices with Limited Storage
- Quasi-Newton Methods, Motivation and Theory
- CUTE
- Line search algorithms with guaranteed sufficient decrease
- A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- The Quasi-Cauchy Relation and Diagonal Updating
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Convergence Conditions for Ascent Methods
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Convergence Conditions for Ascent Methods. II: Some Corrections
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- Benchmarking optimization software with performance profiles.