A globally convergent version of the Polak-Ribière conjugate gradient method
Publication: 1366426
DOI: 10.1007/BF02614362
zbMath: 0887.90157
MaRDI QID: Q1366426
Publication date: 10 September 1997
Published in: Mathematical Programming. Series A. Series B
Keywords: global convergence, unconstrained minimization, line search, Polak-Ribière conjugate gradient method, nonconvex differentiable functions
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Methods of reduced gradient type (90C52)
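The record concerns the Polak-Ribière (PRP) conjugate gradient method, whose search direction is updated as \(d_{k+1} = -g_{k+1} + \beta_{k+1} d_k\) with \(\beta_{k+1}^{PRP} = g_{k+1}^\top (g_{k+1} - g_k) / \|g_k\|^2\). As a point of reference only, the sketch below pairs this update with a plain Armijo backtracking line search and a steepest-descent restart safeguard; it illustrates the general method family, not the specific line search scheme whose global convergence is established in the paper, and all function names and parameter values are illustrative choices.

import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=20000):
    """Polak-Ribiere-Polyak conjugate gradient iteration (illustrative sketch only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search (not the line search analysed in the paper).
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP coefficient: beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2.
        beta = g_new.dot(g_new - g) / g.dot(g)
        d = -g_new + beta * d
        # Safeguard (not part of the paper's scheme): restart with steepest descent
        # whenever the PRP direction fails to be a descent direction.
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: the Rosenbrock function, minimised at (1, 1).
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(prp_cg(rosen, rosen_grad, np.array([-1.2, 1.0])))

Running the script should drive the iterates toward the minimizer \((1, 1)\) of the Rosenbrock test function.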
Related Items
- A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems
- A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems
- A PRP-based residual method for large-scale monotone nonlinear equations
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search
- Convergence properties of a correlation Polak-Ribiére conjugate gradient method
- Convergence properties of a class of nonlinear conjugate gradient methods
- Convergence properties of the dependent PRP conjugate gradient methods
- A new class of supermemory gradient methods
- The convergence properties of some new conjugate gradient methods
- A Barzilai and Borwein scaling conjugate gradient method for unconstrained optimization problems
- Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
- Convergence of the Polak-Ribiére-Polyak conjugate gradient method
- A short note on the global convergence of the unmodified PRP method
- A gradient-related algorithm with inexact line searches
- A new conjugate gradient algorithm with sufficient descent property for unconstrained optimization
- A derivative-free conjugate residual method using secant condition for general large-scale nonlinear equations
- Global convergence of a modified conjugate gradient method
- The convergence of conjugate gradient method with nonmonotone line search
- New spectral PRP conjugate gradient method for unconstrained optimization
- Convergence of Liu-Storey conjugate gradient method
- A new variant of the memory gradient method for unconstrained optimization
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- New step lengths in conjugate gradient methods
- A simple sufficient descent method for unconstrained optimization
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
- Global convergence of a nonlinear conjugate gradient method
- Global convergence of a modified spectral conjugate gradient method
- A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression
- A modified descent Polak-Ribiére-Polyak conjugate gradient method with global convergence property for nonconvex functions
- The projection technique for two open problems of unconstrained optimization problems
- A hybrid conjugate gradient algorithm for nonconvex functions and its applications in image restoration problems
- A modified conjugate gradient algorithm with cyclic Barzilai-Borwein steplength for unconstrained optimization
- A hybrid HS-LS conjugate gradient algorithm for unconstrained optimization with applications in motion control and image recovery
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- An adaptive projection BFGS method for nonconvex unconstrained optimization problems
- \(n\)-step quadratic convergence of the MPRP method with a restart strategy
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Globally convergent modified Perry's conjugate gradient method
- AADIS: an atomistic analyzer for dislocation character and distribution
- Modified three-term Liu-Storey conjugate gradient method for solving unconstrained optimization problems and image restoration problems
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- A modified conjugacy condition and related nonlinear conjugate gradient method
- Sufficient descent Polak-Ribière-Polyak conjugate gradient algorithm for large-scale box-constrained optimization
- A hybrid of DL and WYL nonlinear conjugate gradient methods
- Further studies on the Wei-Yao-Liu nonlinear conjugate gradient method
- A three-parameter family of nonlinear conjugate gradient methods
- Convergence of supermemory gradient method
- Modification of the Wolfe line search rules to satisfy the descent condition in the Polak-Ribière-Polyak conjugate gradient method
- Convergence of PRP method with new nonmonotone line search
- Memory gradient method with Goldstein line search
- Exploiting damped techniques for nonlinear conjugate gradient methods
- A practical PR+ conjugate gradient method only using gradient
- A modified three-term PRP conjugate gradient algorithm for optimization models
- Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search
- A new descent algorithm with curve search rule
- A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem
- Two new conjugate gradient methods based on modified secant equations
- A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization
- On memory gradient method with trust region for unconstrained optimization
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- A note about WYL's conjugate gradient method and its applications
- A conjugate gradient method with descent direction for unconstrained optimization
- Some sufficient descent conjugate gradient methods and their global convergence
- A new Liu-Storey type nonlinear conjugate gradient method for unconstrained optimization problems
- A new family of conjugate gradient methods
- A sufficient descent nonlinear conjugate gradient method for solving \(\mathcal{M}\)-tensor equations
- The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems
- A modified PRP conjugate gradient method
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A \(q\)-Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization problems
- A spectral KRMI conjugate gradient method under the strong-Wolfe line search
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- A spectral conjugate gradient method for nonlinear inverse problems
- Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search
- Convergence of descent method without line search
- A new super-memory gradient method with curve search rule
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- A class of line search-type methods for nonsmooth convex regularized minimization
- Globally convergent diagonal Polak-Ribière-Polyak like algorithm for nonlinear equations
- The modified PRP conjugate gradient algorithm under a non-descent line search and its application in the Muskingum model and image restoration problems
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions
- Conjugate gradient methods with Armijo-type line searches
Cites Work
- Global convergence and stabilization of unconstrained minimization methods without derivatives
- Generalized Polak-Ribière algorithm
- Convergence Properties of Algorithms for Nonlinear Optimization
- Stopping criteria for linesearch methods without derivatives
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Globally convergent conjugate gradient algorithms
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Conjugate Gradient Methods with Inexact Searches
- On the Convergence of a New Conjugate Gradient Algorithm
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems