Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
Publication: 3549169
DOI: 10.1145/1132973.1132979
zbMath: 1346.90816
OpenAlex: W2016518303
Wikidata: Q113310696 (Scholia: Q113310696)
MaRDI QID: Q3549169
Hongchao Zhang, William W. Hager
Publication date: 21 December 2008
Published in: ACM Transactions on Mathematical Software
Full work available at URL: https://doi.org/10.1145/1132973.1132979
MSC classification: Nonlinear programming (90C30); Iterative numerical methods for linear systems (65F10); Methods of reduced gradient type (90C52)
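For orientation (this note is not part of the portal record, but summarizes the cited paper): Algorithm 851 is the CG_DESCENT code implementing the Hager-Zhang nonlinear conjugate gradient method. A minimal sketch of its search-direction update, in LaTeX notation, is

    d_{k+1} = -g_{k+1} + \bar{\beta}_k d_k, \qquad d_0 = -g_0,
    \beta_k^{HZ} = \frac{1}{d_k^{\top} y_k} \left( y_k - 2 d_k \frac{\|y_k\|^2}{d_k^{\top} y_k} \right)^{\top} g_{k+1}, \qquad y_k = g_{k+1} - g_k,
    \bar{\beta}_k = \max\left\{ \beta_k^{HZ},\ \frac{-1}{\|d_k\| \min\{\eta, \|g_k\|\}} \right\},

where g_k = \nabla f(x_k) and \eta > 0 is a small constant (the paper suggests 0.01). The truncation of \beta_k yields the guaranteed descent property g_k^{\top} d_k \le -\tfrac{7}{8} \|g_k\|^2 independently of the line search accuracy.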
Related Items
The Hager–Zhang conjugate gradient algorithm for large-scale nonlinear equations, Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update, A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update, A globally convergent hybrid conjugate gradient method and its numerical behaviors, Two optimal Dai–Liao conjugate gradient methods, An improved three-term conjugate gradient algorithm for solving unconstrained optimization problems, Optimal vaccination strategy for a mean-field stochastic susceptible-infected-vaccinated system, A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach, A modified Hager-Zhang conjugate gradient method with optimal choices for solving monotone nonlinear equations, New conjugate gradient-like methods for unconstrained optimization, A descent extension of the Polak-Ribière-Polyak conjugate gradient method, A Modified Non-Monotone BFGS Method for Non-Convex Unconstrained Optimization, Optimal control of convective FitzHugh-Nagumo equation, A descent Dai-Liao conjugate gradient method for nonlinear equations, Sufficient and necessary conditions of near-optimal controls for a diffusion dengue model with Lévy noise, Computing equilibrium measures with power law kernels, SOBMOR: Structured Optimization-Based Model Order Reduction, An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method, A structured Fletcher-Revees spectral conjugate gradient method for unconstrained optimization with application in robotic model, An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization, Bayesian calibration for large-scale fluid structure interaction problems under embedded/immersed boundary framework, A projection-based derivative free DFP approach for solving system of nonlinear convex constrained monotone equations with image restoration applications, Preconditioned nonlinear conjugate gradient methods based on a modified secant equation, Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing, A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression, A New Adaptive Conjugate Gradient Algorithm for Large-Scale Unconstrained Optimization, A modified descent Polak-Ribiére-Polyak conjugate gradient method with global convergence property for nonconvex functions, Alternating cyclic vector extrapolation technique for accelerating nonlinear optimization algorithms and fixed-point mapping applications, A family of the modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent and conjugacy conditions, A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization, A NONMONOTONE ADMM-BASED DIAGONAL QUASI-NEWTON UPDATE WITH APPLICATION TO THE COMPRESSIVE SENSING PROBLEM, A new subspace minimization conjugate gradient method for unconstrained minimization, Adaptive multigrid strategy for geometry optimization of large-scale three dimensional molecular mechanics, Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization, A restart scheme for the memoryless BFGS method, Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization, A descent family of the spectral Hestenes–Stiefel method by considering the quasi-Newton method,
A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem, Necessary and sufficient conditions for near-optimal controls of a stochastic West Nile virus system with spatial diffusion, An overview of nonlinear optimization, A proximal quasi-Newton method based on memoryless modified symmetric rank-one formula, A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization, An adaptive nonmonotone trust region algorithm, Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization, A modified Hestense–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method, A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations, Improving the Dai-Liao parameter choices using a fixed point equation, A novel energy-based approach for merging finite elements, An efficient hybrid conjugate gradient method for unconstrained optimization, Adjoint-based optimization of PDEs in moving domains, Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization, Self-adaptive inexact proximal point methods, Erratum to: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Modeling and control through leadership of a refined flocking system, A new nonmonotone line search technique for unconstrained optimization, A descent family of Dai–Liao conjugate gradient methods, Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, A Two-Term PRP-Based Descent Method, INITIAL IMPROVEMENT OF THE HYBRID ACCELERATED GRADIENT DESCENT PROCESS, Second-order adjoints for solving PDE-constrained optimization problems, An efficient adaptive scaling parameter for the spectral conjugate gradient method, Some sufficient descent conjugate gradient methods and their global convergence, The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices, A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization, A new family of conjugate gradient methods, A sufficient descent Liu–Storey conjugate gradient method and its global convergence, MATRIX ANALYSES ON THE DAI–LIAO CONJUGATE GRADIENT METHOD, Unnamed Item, A descent hybrid modification of the Polak–Ribière–Polyak conjugate gradient method, On the control through leadership of the Hegselmann–Krause opinion formation model, Two hybrid nonlinear conjugate gradient methods based on a modified secant equation, Rate of Convergence of a Restarted CG-DESCENT Method, An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems, Riemannian Multigrid Line Search for Low-Rank Problems, A Perry-type derivative-free algorithm for solving nonlinear system of equations and minimizing ℓ1 regularized problem, A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization, A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method, Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization, A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization, On Hager and Zhang's conjugate gradient method with guaranteed descent,
Some modified conjugate gradient methods for unconstrained optimization, Spectral method and its application to the conjugate gradient method, An online conjugate gradient algorithm for large-scale data analysis in machine learning, An optimal parameter for Dai-Liao family of conjugate gradient methods, A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection, A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems, A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization, Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing, LMBOPT: a limited memory method for bound-constrained optimization, A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems, New hybrid conjugate gradient method as a convex combination of LS and FR methods, Optimal scaling parameters for spectral conjugate gradient methods, Two accelerated nonmonotone adaptive trust region line search methods, Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme, An improved Perry conjugate gradient method with adaptive parameter choice, An extended delayed weighted gradient algorithm for solving strongly convex optimization problems, Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions, Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model, A hybrid quasi-Newton method with application in sparse recovery, On the optimal control of the Schlögl-model, An active set trust-region method for bound-constrained optimization, An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems, Symmetric Perry conjugate gradient method, The convergence of conjugate gradient method with nonmonotone line search, Two modified scaled nonlinear conjugate gradient methods, A simple sufficient descent method for unconstrained optimization, An efficient multigrid strategy for large-scale molecular mechanics optimization, Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions, New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method, Inexact restoration method for minimization problems arising in electronic structure calculations, The projection technique for two open problems of unconstrained optimization problems, A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties, A modified conjugate gradient algorithm with cyclic Barzilai-Borwein steplength for unconstrained optimization, A Barzilai-Borwein gradient projection method for sparse signal and blurred image restoration, Using approximate secant equations in limited memory methods for multilevel unconstrained optimization, Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization, Two proposals for robust PCA using semidefinite programming, The global convergence of a new mixed conjugate gradient method for unconstrained optimization, Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization,
W-methods in optimal control, A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems, A survey of gradient methods for solving nonlinear optimization, Sufficient descent conjugate gradient methods for solving convex constrained nonlinear monotone equations, Two modified three-term conjugate gradient methods with sufficient descent property, An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization, A novel value for the parameter in the Dai-Liao-type conjugate gradient method, On the sufficient descent condition of the Hager-Zhang conjugate gradient methods, A class of accelerated subspace minimization conjugate gradient methods, An adaptive conjugate gradient algorithm for large-scale unconstrained optimization, Norm descent conjugate gradient methods for solving symmetric nonlinear equations, Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations, Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method, CGRS -- an advanced hybrid method for global optimization of continuous functions closely coupling extended random search and conjugate gradient method, Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing, On optimality of two adaptive choices for the parameter of Dai-Liao method, Dai-Kou type conjugate gradient methods with a line search only using gradient, A class of adaptive dai-liao conjugate gradient methods based on the scaled memoryless BFGS update, A modified three-term PRP conjugate gradient algorithm for optimization models, A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs, Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search, Conjugate gradient methods using value of objective function for unconstrained optimization, A modified three-term conjugate gradient method with sufficient descent property, A conjugate gradient method for unconstrained optimization problems, A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing, Reduced order optimal control of the convective FitzHugh-Nagumo equations, A conjugate gradient method to solve convex constrained monotone equations with applications in compressive sensing, Two--parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length, Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, A modified conjugate gradient method for monotone nonlinear equations with convex constraints, A modified nonmonotone trust region line search method, Optimal vaccination strategy for an SIRS model with imprecise parameters and Lévy noise, A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique, Two adaptive Dai-Liao nonlinear conjugate gradient methods, Quasi-Newton acceleration for equality-constrained minimization, Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems, Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search, Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization, A stochastic subspace approach to gradient-free optimization in high dimensions,
An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization, A conjugate gradient algorithm and its applications in image restoration, Two limited-memory optimization methods with minimum violation of the previous secant conditions, CG_DESCENT, Truncated trust region method for nonlinear inverse problems and application in full-waveform inversion, Adaptive continuation solid isotropic material with penalization for volume constrained compliance minimization, The global proof of the Polak-Ribière-Polak algorithm under the YWL inexact line search technique, The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions, Solving unconstrained optimization problems via hybrid CD-DY conjugate gradient methods with applications, A subspace minimization conjugate gradient method based on conic model for unconstrained optimization, A new conjugate gradient method with an efficient memory structure, A modified PRP-type conjugate gradient projection algorithm for solving large-scale monotone nonlinear equations with convex constraint, Scaled nonlinear conjugate gradient methods for nonlinear least squares problems, Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing, Convergence analysis of a nonmonotone projected gradient method for multiobjective optimization problems, A nonmonotone scaled Fletcher-Reeves conjugate gradient method with application in image reconstruction, An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix, Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems, Towards a comprehensive approach to optimal control of non-ideal binary batch distillation, A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions, Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
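Many of the related items above adapt or extend the Hager–Zhang search-direction update catalogued by this record. As an illustration only (a hedged NumPy sketch with hypothetical names, not the CG_DESCENT reference code), one direction update with the standard truncation can be written as:

import numpy as np

def hz_direction(g_new, g_old, d_old, eta=0.01):
    # One Hager-Zhang direction update (illustrative sketch, not the CG_DESCENT code).
    # g_new, g_old: gradients at the new and old iterates; d_old: previous search direction.
    y = g_new - g_old                     # y_k = g_{k+1} - g_k
    dy = float(d_old @ y)                 # d_k^T y_k (assumed nonzero under a Wolfe line search)
    beta = float((y - 2.0 * d_old * (y @ y) / dy) @ g_new) / dy
    # Truncation used by Hager and Zhang to retain convergence for general functions
    eta_k = -1.0 / (np.linalg.norm(d_old) * min(eta, np.linalg.norm(g_old)))
    return -g_new + max(beta, eta_k) * d_old

The truncation against eta_k is the design choice that preserves the sufficient descent property even when beta would otherwise become too negative.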
Uses Software