A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions
Publication: 2103175
DOI: 10.1007/s12190-022-01724-z
zbMath: 1502.65035
OpenAlex: W4293100925
MaRDI QID: Q2103175
Mengxiang Zhang, Jiajia Yu, Ailun Jian, Gong Lin Yuan
Publication date: 13 December 2022
Published in: Journal of Applied Mathematics and Computing
Full work available at URL: https://doi.org/10.1007/s12190-022-01724-z
- Numerical mathematical programming methods (65K05)
- Nonconvex programming, global optimization (90C26)
- Nonlinear programming (90C30)
- Methods of reduced gradient type (90C52)
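For context, the publication concerns the Hager-Zhang (HZ) conjugate gradient method (see "A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search" and "Algorithm 851" in the cited works below). The following is a minimal sketch of the standard HZ iteration only, not the modified algorithm of this paper; the backtracking Armijo line search and the restart safeguard are illustrative assumptions, standing in for the Wolfe-type line search used in CG_DESCENT.

```python
# Minimal sketch of the standard Hager-Zhang (HZ) conjugate gradient method.
# NOT the modified algorithm of the cited publication; the Armijo backtracking
# line search and the restart safeguard below are simplifying assumptions.
import numpy as np

def hz_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking Armijo line search (assumed, for simplicity).
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, gtd = f(x), g.dot(d)
        while f(x + alpha * d) > fx + c * alpha * gtd and alpha > 1e-16:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d.dot(y)
        if abs(dy) < 1e-12:
            d = -g_new                      # safeguard: restart on tiny curvature
        else:
            # HZ parameter: beta = (y - 2 d ||y||^2 / (d'y))' g_new / (d'y)
            beta = (y - 2.0 * d * y.dot(y) / dy).dot(g_new) / dy
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Usage example: minimize the Rosenbrock function from a standard start point.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                               200*(x[1] - x[0]**2)])
    print(hz_cg(f, grad, np.array([-1.2, 1.0])))
```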
Uses Software
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A conjugate gradient method with descent direction for unconstrained optimization
- The convergence properties of some new conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- BFGS trust-region method for symmetric nonlinear equations
- A modified PRP conjugate gradient method
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A new adaptive trust region algorithm for optimization problems
- A class of parameter estimation methods for nonlinear Muskingum model using hybrid invasive weed optimization algorithm
- An adaptive trust region algorithm for large-residual nonsmooth least squares problems
- A novel parameter estimation method for Muskingum model using new Newton-type trust region algorithm
- A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models
- An effective adaptive trust region algorithm for nonsmooth minimization
- The projection technique for two open problems of unconstrained optimization problems
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Scaled conjugate gradient algorithms for unconstrained optimization
- Convergence Properties of Algorithms for Nonlinear Optimization
- A class of derivative-free methods for large-scale nonlinear monotone equations
- A new trust-region method with line search for solving symmetric nonlinear equations
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Convergence Properties of the BFGS Algorithm
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- A method for the solution of certain non-linear problems in least squares
- A spectral conjugate gradient method for unconstrained optimization