Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
From MaRDI portal
Publication:1001321
DOI: 10.1007/s11590-008-0086-5
zbMath: 1154.90623
OpenAlex: W1998169401
MaRDI QID: Q1001321
Publication date: 17 February 2009
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-008-0086-5
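For context when browsing this entry: the paper belongs to the family of Polak-Ribière-Polyak (PRP)-type nonlinear conjugate gradient methods whose search directions are forced to satisfy a sufficient descent condition. The sketch below is a generic, assumption-based illustration of that idea only (standard PRP+ coefficient, Armijo backtracking in place of the Wolfe-type line searches usually analysed in this literature, and a crude restart safeguard); the name prp_plus_cg, the constants c and rho, and the 1e-10 threshold are illustrative choices, not the specific modification proposed in this paper.

import numpy as np

def prp_plus_cg(f, grad, x0, max_iter=1000, tol=1e-6, c=1e-4, rho=0.5):
    """Generic PRP+ nonlinear conjugate gradient sketch with a sufficient
    descent safeguard. Illustrative only; not the modified method of the
    paper catalogued on this page."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search along the descent direction d
        alpha, fx, gTd = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * gTd:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere-Polyak coefficient, truncated at zero (PRP+)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        # Sufficient descent safeguard: restart with -g if the new
        # direction fails g^T d <= -eps * ||g||^2
        if g_new @ d > -1e-10 * (g_new @ g_new):
            d = -g_new
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Small check on a strictly convex quadratic f(x) = 0.5 x'Ax - b'x,
    # whose minimiser solves A x = b.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    print(prp_plus_cg(f, grad, np.zeros(2)))   # approx. [0.2, 0.4]

In the works listed below, the sufficient descent property is usually stated as g_k^T d_k <= -c ||g_k||^2 for some constant c > 0 independent of the line search; the restart safeguard above enforces it only in a blunt way, whereas the modified formulas studied in this line of research build it into the direction itself.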
Related Items (77)
- The Hager–Zhang conjugate gradient algorithm for large-scale nonlinear equations
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- Globally convergent three-term conjugate gradient projection methods for solving nonlinear monotone equations
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
- A new adaptive trust region algorithm for optimization problems
- A modified sufficient descent Polak-Ribiére-Polyak type conjugate gradient method for unconstrained optimization problems
- A modified Wei-Yao-Liu conjugate gradient method for unconstrained optimization
- An improved Perry conjugate gradient method with adaptive parameter choice
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- Modified nonlinear conjugate gradient method with sufficient descent condition for unconstrained optimization
- A descent extension of the Polak-Ribière-Polyak conjugate gradient method
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems
- Global convergence of a modified Broyden family method for nonconvex functions
- Nonlinear conjugate gradient methods with Wolfe type line search
- A self-adjusting spectral conjugate gradient method for large-scale unconstrained optimization
- A distributed conjugate gradient online learning method over networks
- On global convergence of gradient descent algorithms for generalized phase retrieval problem
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- A class of accelerated conjugate-gradient-like methods based on a modified secant equation
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- A modified PRP-type conjugate gradient algorithm with complexity analysis and its application to image restoration problems
- Global convergence of a modified spectral conjugate gradient method
- A modified descent Polak-Ribiére-Polyak conjugate gradient method with global convergence property for nonconvex functions
- The projection technique for two open problems of unconstrained optimization problems
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem
- An active set limited memory BFGS algorithm for bound constrained optimization
- The global convergence of a new mixed conjugate gradient method for unconstrained optimization
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- Unnamed Item
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- Extension of modified Polak-Ribière-Polyak conjugate gradient method to linear equality constraints minimization problems
- Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
- A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations
- Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
- A quasi-Newton algorithm for large-scale nonlinear equations
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A BFGS algorithm for solving symmetric nonlinear equations
- A modified three-term PRP conjugate gradient algorithm for optimization models
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Conjugate gradient methods using value of objective function for unconstrained optimization
- A new supermemory gradient method for unconstrained optimization problems
- A modified three-term conjugate gradient method with sufficient descent property
- A conjugate gradient method for unconstrained optimization problems
- A Kronecker approximation with a convex constrained optimization method for blind image restoration
- A multimethod technique for solving optimal control problem
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems
- A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems
- Optimization for limited angle tomography in medical image processing
- INITIAL IMPROVEMENT OF THE HYBRID ACCELERATED GRADIENT DESCENT PROCESS
- Two Modified Polak–Ribière–Polyak-Type Nonlinear Conjugate Methods with Sufficient Descent Property
- A New Method with Descent Property for Symmetric Nonlinear Equations
- A conjugate gradient method with descent direction for unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- Two modified DY conjugate gradient methods for unconstrained optimization problems
- A nonmonotone supermemory gradient algorithm for unconstrained optimization
- A Trust Region Algorithm with Conjugate Gradient Technique for Optimization Problems
- A conjugate gradient algorithm and its applications in image restoration
- A descent hybrid modification of the Polak–Ribière–Polyak conjugate gradient method
- The global proof of the Polak-Ribière-Polak algorithm under the YWL inexact line search technique
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A \(q\)-Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization problems
- Global convergence of a descent PRP type conjugate gradient method for nonconvex optimization
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- A Multimethod Technique for Solving Optimal Control Problem
- Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
Uses Software
Cites Work
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- The convergence properties of some new conjugate gradient methods
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Convergence Properties of Algorithms for Nonlinear Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles