A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
Publication: 897051
DOI: 10.1007/s10898-015-0310-7 · zbMath: 1337.90043 · OpenAlex: W893842695 · MaRDI QID: Q897051
Publication date: 16 December 2015
Published in: Journal of Global Optimization
Full work available at URL: https://doi.org/10.1007/s10898-015-0310-7
Keywords: unconstrained optimization; large-scale optimization; L-BFGS method; modified quasi-Newton equation; hybrid genetic algorithm
Mathematics Subject Classification: Large-scale problems in mathematical programming (90C06); Approximation methods and heuristics in mathematical programming (90C59)
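For context on the keywords above: the standard quasi-Newton (secant) equation, together with one representative modified secant condition from the cited work of Wei, Li and Qi ("New quasi-Newton methods for unconstrained optimization problems"), is sketched below in LaTeX. The exact modified condition and regularization used in this publication are given in the full text and are not reproduced here.

    % Standard secant condition on the Hessian approximation B_{k+1}:
    \[ B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \qquad y_k = g_{k+1} - g_k \]
    % A representative modified secant condition (Wei-Li-Qi type), which
    % exploits function values f_k, f_{k+1} in addition to gradients:
    \[ B_{k+1} s_k = y_k^{*}, \qquad
       y_k^{*} = y_k + \frac{\vartheta_k}{\lVert s_k\rVert^{2}}\, s_k, \qquad
       \vartheta_k = 2\,(f_k - f_{k+1}) + (g_{k+1} + g_k)^{\top} s_k \]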
Related Items (5)
- A regularized limited memory BFGS method for large-scale unconstrained optimization and its efficient implementations
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
Uses Software
Cites Work
- New quasi-Newton methods via higher order tensor models
- Hybrid spectral gradient method for the unconstrained minimization problem
- On the limited memory BFGS method for large scale optimization
- Two new conjugate gradient methods based on modified secant equations
- A limited memory BFGS-type method for large-scale unconstrained optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Representations of quasi-Newton matrices and their use in limited memory methods
- The BFGS method with exact line searches fails for non-convex objective functions
- Improved Hessian approximations for the limited memory BFGS method
- Damped techniques for the limited memory BFGS method for large-scale optimization
- A regularized limited memory BFGS method for nonconvex unconstrained minimization
- New quasi-Newton methods for unconstrained optimization problems
- Truncated-Newton algorithms for large-scale unconstrained optimization
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Convergence of the BFGS Method for $LC^1 $ Convex Constrained Optimization
- Updating Quasi-Newton Matrices with Limited Storage
- A Numerical Study of the Limited Memory BFGS Method and the Truncated-Newton Method for Large Scale Optimization
- Quasi-Newton Methods, Motivation and Theory
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- Convergence Properties of the BFGS Algorithm
- A Limited Memory Algorithm for Bound Constrained Optimization
- CUTEr and SifDec
- A practical update criterion for SQP method
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles
- Global convergence of a regularized factorized quasi-Newton method for nonlinear least squares problems
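Background on the limited-memory machinery referenced in the cited works of Liu and Nocedal ("On the limited memory BFGS method for large scale optimization") and of Byrd, Nocedal and Schnabel ("Representations of quasi-Newton matrices and their use in limited memory methods"): the standard L-BFGS search direction is computed by the classical two-loop recursion. A minimal Python sketch follows; the function name lbfgs_direction and all variable names are illustrative, the code assumes stored curvature pairs with s_i^T y_i > 0, and it implements only the classical recursion, not the regularized, modified-secant variant proposed in this publication.

    import numpy as np

    def lbfgs_direction(grad, s_list, y_list):
        """Standard L-BFGS two-loop recursion: returns d = -H_k * grad,
        where H_k is the implicit inverse-Hessian approximation built
        from the stored correction pairs (s_i, y_i)."""
        if not s_list:                       # no curvature history yet
            return -grad
        q = grad.copy()
        # rho_i = 1 / (y_i^T s_i); positive when s_i^T y_i > 0 holds.
        rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
        alphas = []
        # First loop: most recent pair to oldest.
        for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
            alpha = rho * np.dot(s, q)
            alphas.append(alpha)
            q = q - alpha * y
        # Initial scaling H_0 = gamma * I (the usual scaling choice).
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
        r = gamma * q
        # Second loop: oldest pair to most recent.
        for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
            beta = rho * np.dot(y, r)
            r = r + (alpha - beta) * s
        return -r

In an iteration this would be used as d_k = lbfgs_direction(g_k, S, Y), followed by a line search and an update of the stored pairs, dropping the oldest pair once the memory limit m is reached.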