Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
From MaRDI portal
Publication:3700719
DOI: 10.1093/imanum/5.1.121 · zbMath: 0578.65063 · OpenAlex: W2158894942 · MaRDI QID: Q3700719
Publication date: 1985
Published in: IMA Journal of Numerical Analysis
Full work available at URL: https://doi.org/10.1093/imanum/5.1.121
Related Items
The Hager–Zhang conjugate gradient algorithm for large-scale nonlinear equations ⋮ Nonlinear conjugate gradient methods for the optimal control of laser surface hardening ⋮ Sufficient descent conjugate gradient methods for large-scale optimization problems ⋮ Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization ⋮ A scaled three-term conjugate gradient method for unconstrained optimization ⋮ A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach ⋮ A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family ⋮ The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search ⋮ An improved Hoschek intrinsic parametrization ⋮ Convergence conditions, line search algorithms and trust region implementations for the Polak–Ribière conjugate gradient method ⋮ New conjugate gradient-like methods for unconstrained optimization ⋮ A Subspace Study on Conjugate Gradient Algorithms ⋮ Globally convergence of nonlinear conjugate gradient method for unconstrained optimization ⋮ A decent three term conjugate gradient method with global convergence properties for large scale unconstrained optimization problems ⋮ A structured Fletcher-Reeves spectral conjugate gradient method for unconstrained optimization with application in robotic model ⋮ Two diagonal conjugate gradient like methods for unconstrained optimization ⋮ Normalized Wolfe-Powell-type local minimax method for finding multiple unstable solutions of nonlinear elliptic PDEs ⋮ New conjugate gradient method for unconstrained optimization ⋮ A three-term conjugate gradient algorithm with restart procedure to solve image restoration problems ⋮ Preconditioned nonlinear conjugate gradient methods based on a modified secant equation ⋮ Accelerated sparse recovery via gradient descent with 
nonlinear conjugate gradient momentum ⋮ Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization ⋮ On the convergence rate of Fletcher‐Reeves nonlinear conjugate gradient methods satisfying strong Wolfe conditions: Application to parameter identification in problems governed by general dynamics ⋮ Solving Unconstrained Optimization Problems with Some Three-term Conjugate Gradient Methods ⋮ A hybrid HS-LS conjugate gradient algorithm for unconstrained optimization with applications in motion control and image recovery ⋮ Global convergence properties of the BBB conjugate gradient method ⋮ A modified nonlinear Polak-Ribière-Polyak conjugate gradient method with sufficient descent property ⋮ Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search ⋮ Two-step conjugate gradient method for unconstrained optimization ⋮ A modified secant equation quasi-Newton method for unconstrained optimization ⋮ A CONJUGATE GRADIENT-NEURAL NETWORK TECHNIQUE FOR ULTRASOUND INVERSE IMAGING ⋮ A modified Hestenes–Stiefel conjugate gradient method with an optimal property ⋮ A survey of gradient methods for solving nonlinear optimization ⋮ A three-parameter family of nonlinear conjugate gradient methods ⋮ A Computational Approach to Controllability Issues for Flow-Related Models. 
(I): Pointwise Control of the Viscous Burgers Equation ⋮ An efficient adaptive three-term extension of the Hestenes–Stiefel conjugate gradient method ⋮ A note on global convergence result for conjugate gradient methods ⋮ On the method of shortest residuals for unconstrained optimization ⋮ A globally convergent gradient-like method based on the Armijo line search ⋮ On Conjugate Gradient Algorithms as Objects of Scientific Study ⋮ An efficient hybrid conjugate gradient method for unconstrained optimization ⋮ A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs ⋮ Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications ⋮ Two descent hybrid conjugate gradient methods for optimization ⋮ Conjugate gradient algorithm and fractals ⋮ Global convergence of the DY conjugate gradient method with Armijo line search for unconstrained optimization problems ⋮ Some descent three-term conjugate gradient methods and their global convergence ⋮ An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems ⋮ Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems ⋮ Hybrid conjugate gradient methods for unconstrained optimization ⋮ An efficient conjugate direction method with orthogonalization for large-scale quadratic optimization problems ⋮ Global convergence property of \(s\)-dependent GFR conjugate gradient method ⋮ Nonlinear Conjugate Gradient Methods for Vector Optimization ⋮ Modified Hestenes-Stiefel conjugate gradient coefficient for unconstrained optimization ⋮ The convergence properties of RMIL+ conjugate gradient method under the strong Wolfe line search ⋮ The complex dynamic of conjugate gradient method ⋮ Some sufficient descent conjugate gradient methods and their global convergence ⋮ A family of three-term conjugate gradient methods with 
sufficient descent property for unconstrained optimization ⋮ Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search ⋮ Convergence of the descent Dai–Yuan conjugate gradient method for unconstrained optimization ⋮ A new family of conjugate gradient methods ⋮ A Trust Region Algorithm with Conjugate Gradient Technique for Optimization Problems ⋮ A sufficient descent conjugate gradient method and its global convergence ⋮ Convergence of conjugate gradient methods with constant stepsizes ⋮ Applying the Powell's Symmetrical Technique to Conjugate Gradient Methods with the Generalized Conjugacy Condition ⋮ A modified spectral conjugate gradient method with global convergence ⋮ A new classical conjugate gradient coefficient with exact line search ⋮ Two hybrid nonlinear conjugate gradient methods based on a modified secant equation ⋮ A new two-parameter family of nonlinear conjugate gradient methods ⋮ A new, globally convergent Riemannian conjugate gradient method ⋮ A nonlinear conjugate gradient method based on the MBFGS secant condition ⋮ A hybrid conjugate gradient method with descent property for unconstrained optimization ⋮ A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice ⋮ A nonmonotone hybrid conjugate gradient method for unconstrained optimization ⋮ An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimisation ⋮ Comments on "New hybrid conjugate gradient method as a convex combination of FR and PRP methods" ⋮ Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization ⋮ A new hybrid conjugate gradient method of unconstrained optimization methods ⋮ A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter ⋮ Extended Dai-Yuan conjugate gradient strategy for large-scale unconstrained optimization with applications to compressive sensing ⋮ A new class of 
nonlinear conjugate gradient coefficients for unconstrained optimization ⋮ Some modified conjugate gradient methods for unconstrained optimization ⋮ Spectral method and its application to the conjugate gradient method ⋮ A new conjugate gradient hard thresholding pursuit algorithm for sparse signal recovery ⋮ Two efficient modifications of AZPRP conjugate gradient method with sufficient descent property ⋮ A descent hybrid conjugate gradient method based on the memoryless BFGS update ⋮ An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition ⋮ A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection ⋮ A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems ⋮ Global convergence of the Fletcher-Reeves algorithm with inexact linesearch ⋮ A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems ⋮ New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems ⋮ A globally and R-linearly convergent hybrid HS and PRP method and its inexact version with applications ⋮ New hybrid conjugate gradient method as a convex combination of LS and FR methods ⋮ A new family of globally convergent conjugate gradient methods ⋮ Techniques for gradient-based bilevel optimization with non-smooth lower level problems ⋮ Global convergence properties of the two new dependent Fletcher-Reeves conjugate gradient methods ⋮ On the convergence of \(s\)-dependent GFR conjugate gradient method for unconstrained optimization ⋮ Global convergence of a memory gradient method for unconstrained optimization ⋮ A globally convergent version of the Polak-Ribière conjugate gradient method ⋮ Convergence properties of the dependent PRP conjugate gradient methods ⋮ Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex 
functions ⋮ A sufficient descent Dai-Yuan type nonlinear conjugate gradient method for unconstrained optimization problems ⋮ A Barzilai and Borwein scaling conjugate gradient method for unconstrained optimization problems ⋮ Efficient hybrid conjugate gradient techniques ⋮ Strong global convergence of an adaptive nonmonotone memory gradient method ⋮ An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search ⋮ A new conjugate gradient algorithm with sufficient descent property for unconstrained optimization ⋮ Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search ⋮ Efficient rank reduction of correlation matrices ⋮ New conjugacy condition and related new conjugate gradient methods for unconstrained optimization ⋮ A conjugate gradient method for the unconstrained minimization of strictly convex quadratic splines ⋮ Symmetric Perry conjugate gradient method ⋮ The convergence of conjugate gradient method with nonmonotone line search ⋮ Convergence of Liu-Storey conjugate gradient method ⋮ Convergence and stability of line search methods for unconstrained optimization ⋮ A descent nonlinear conjugate gradient method for large-scale unconstrained optimization ⋮ A new variant of the memory gradient method for unconstrained optimization ⋮ A new class of nonlinear conjugate gradient coefficients with global convergence properties ⋮ Further comment on another hybrid conjugate gradient algorithm for unconstrained optimization by Andrei ⋮ Two modified HS type conjugate gradient methods for unconstrained optimization problems ⋮ New step lengths in conjugate gradient methods ⋮ A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods ⋮ Global convergence of a nonlinear conjugate gradient method ⋮ The projection technique for two open problems of unconstrained optimization problems ⋮ A variant spectral-type FR conjugate gradient method and its global convergence ⋮ Three-term conjugate 
gradient method for the convex optimization problem over the fixed point set of a nonexpansive mapping ⋮ Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property ⋮ Global convergence of algorithms with nonmonotone line search strategy in unconstrained optimization ⋮ Globally convergent modified Perry's conjugate gradient method ⋮ The global convergence of a new mixed conjugate gradient method for unconstrained optimization ⋮ Efficient generalized conjugate gradient algorithms. I: Theory ⋮ Global convergence of some modified PRP nonlinear conjugate gradient methods ⋮ Two effective hybrid conjugate gradient algorithms based on modified BFGS updates ⋮ A modified conjugacy condition and related nonlinear conjugate gradient method ⋮ A hybrid of DL and WYL nonlinear conjugate gradient methods ⋮ An extension of the Fletcher-Reeves method to linear equality constrained optimization problem ⋮ Further insight into the convergence of the Fletcher-Reeves method ⋮ A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches ⋮ Global convergence of a modified spectral FR conjugate gradient method ⋮ Global convergence of a memory gradient method without line search ⋮ Multi-step nonlinear conjugate gradient methods for unconstrained minimization ⋮ Nonlinear CG-like iterative methods ⋮ Exploiting damped techniques for nonlinear conjugate gradient methods ⋮ A modified three-term PRP conjugate gradient algorithm for optimization models ⋮ Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search ⋮ Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search ⋮ A conjugate gradient method for unconstrained optimization problems ⋮ Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems ⋮ A modified CG-DESCENT method for unconstrained optimization ⋮ Applying Powell's symmetrical 
technique to conjugate gradient methods ⋮ A modified nonlinear conjugate gradient method with the Armijo line search and its application ⋮ A modified three-term type CD conjugate gradient algorithm for unconstrained optimization problems ⋮ Two new conjugate gradient methods based on modified secant equations ⋮ A class of modified FR conjugate gradient method and applications to non-negative matrix factorization ⋮ A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods ⋮ Riemannian conjugate gradient methods with inverse retraction ⋮ Hybrid Riemannian conjugate gradient methods with global convergence properties ⋮ A note about WYL's conjugate gradient method and its applications ⋮ A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique ⋮ An efficient modified AZPRP conjugate gradient method for large-scale unconstrained optimization problem ⋮ A conjugate gradient method with descent direction for unconstrained optimization ⋮ Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization ⋮ Sufficient descent Riemannian conjugate gradient methods ⋮ Large sparse continuation problems ⋮ Hybrid conjugate gradient method for a convex optimization problem over the fixed-point set of a nonexpansive mapping ⋮ A generalized conjugate gradient algorithm ⋮ Behavior of the combination of PRP and HZ methods for unconstrained optimization ⋮ A modified PRP conjugate gradient method ⋮ A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration ⋮ Solving unconstrained optimization problems via hybrid CD-DY conjugate gradient methods with applications ⋮ A derivative-based bracketing scheme for univariate minimization and the conjugate gradient method ⋮ Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization ⋮ An improved Wei-Yao-Liu 
nonlinear conjugate gradient method for optimization computation ⋮ A class of nonmonotone conjugate gradient methods for unconstrained optimization ⋮ Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search ⋮ Two classes of spectral conjugate gradient methods for unconstrained optimizations ⋮ Global convergence result for conjugate gradient methods ⋮ A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations ⋮ Conjugate gradient methods with Armijo-type line searches. ⋮ Global convergence of the Dai-Yuan conjugate gradient method with perturbations