
scientific article; zbMATH DE number 3526471

From MaRDI portal

zbMath: 0336.90057; MaRDI QID: Q4103338

G. Zoutendijk

Publication date: 1970


Title: zbMATH Open Web Interface contents unavailable due to conflicting licenses.



Related Items

- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- A New Formula on the Conjugate Gradient Method for Removing Impulse Noise Images
- Globally convergent conjugate gradient algorithms
- A modified sufficient descent Polak-Ribiére-Polyak type conjugate gradient method for unconstrained optimization problems
- A family of hybrid conjugate gradient methods for unconstrained optimization
- The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search
- New hybrid conjugate gradient method as a convex combination of LS and FR methods
- A modified conjugate gradient method based on a modified secant equation
- An improved Hoschek intrinsic parametrization
- A convergent hybrid three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- Global convergence of a modified conjugate gradient method
- Two new conjugate gradient methods for unconstrained optimization
- A decent three term conjugate gradient method with global convergence properties for large scale unconstrained optimization problems
- Two families of self-adjusting spectral hybrid DL conjugate gradient methods and applications in image denoising
- A family of hybrid conjugate gradient method with restart procedure for unconstrained optimizations and image restorations
- Global convergence of the gradient method for functions definable in o-minimal structures
- Diagonal approximation of the Hessian by finite differences for unconstrained optimization
- A structured Fletcher-Revees spectral conjugate gradient method for unconstrained optimization with application in robotic model
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- Two families of hybrid conjugate gradient methods with restart procedures and their applications
- New conjugate gradient method for unconstrained optimization
- A three-term conjugate gradient algorithm with restart procedure to solve image restoration problems
- A modified PRP-type conjugate gradient algorithm with complexity analysis and its application to image restoration problems
- A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems
- Generalized RMIL conjugate gradient method under the strong Wolfe line search with application in image processing
- On the convergence rate of Fletcher‐Reeves nonlinear conjugate gradient methods satisfying strong Wolfe conditions: Application to parameter identification in problems governed by general dynamics
- A robust BFGS algorithm for unconstrained nonlinear optimization problems
- Solving Unconstrained Optimization Problems with Some Three-term Conjugate Gradient Methods
- Global convergence properties of the BBB conjugate gradient method
- A spectral conjugate gradient method for solving large-scale unconstrained optimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Two-step conjugate gradient method for unconstrained optimization
- An active set method for solving linearly constrained nonsmooth optimization problems
- On search directions for minimization algorithms
- A modified Hestenes–Stiefel conjugate gradient method with an optimal property
- A survey of gradient methods for solving nonlinear optimization
- A new conjugate gradient algorithm with cubic Barzilai–Borwein stepsize for unconstrained optimization
- A novel value for the parameter in the Dai-Liao-type conjugate gradient method
- A modified Hestense–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A globally convergent gradient-like method based on the Armijo line search
- An efficient hybrid conjugate gradient method for unconstrained optimization
- Globally convergent inexact generalized Newton method for first-order differentiable optimization problems
- Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications
- Two descent hybrid conjugate gradient methods for optimization
- Packing different cuboids with rotations and spheres into a cuboid
- A new nonmonotone line search technique for unconstrained optimization
- A new nonlinear conjugate gradient method with guaranteed global convergence
- Conjugate gradient algorithm and fractals
- Mathematical model and efficient algorithms for object packing problem
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- Two Modified Polak–Ribière–Polyak-Type Nonlinear Conjugate Methods with Sufficient Descent Property
- Global convergence property of \(s\)-dependent GFR conjugate gradient method
- The convergence properties of RMIL+ conjugate gradient method under the strong Wolfe line search
- An efficient adaptive scaling parameter for the spectral conjugate gradient method
- Some sufficient descent conjugate gradient methods and their global convergence
- On the convergence of sequential minimization algorithms
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- Convergence of the descent Dai–Yuan conjugate gradient method for unconstrained optimization
- Two modified DY conjugate gradient methods for unconstrained optimization problems
- Dynamic search trajectory methods for global optimization
- Restart procedures for the conjugate gradient method
- Some three-term conjugate gradient methods with the new direction structure
- A class of globally convergent three-term Dai-Liao conjugate gradient methods
- Simultaneous reconstruction of the perfusion coefficient and initial temperature from time-average integral temperature measurements
- Behavior of the combination of PRP and HZ methods for unconstrained optimization
- Simultaneous identification and reconstruction of the space-dependent reaction coefficient and source term
- Least-squares-based three-term conjugate gradient methods
- The new spectral conjugate gradient method for large-scale unconstrained optimisation
- A modified spectral conjugate gradient method with global convergence
- A \(q\)-Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization problems
- A new steepest descent method with global convergence properties
- A modified gradient method for finite element elastoplastic analysis by quadratic programming
- Alternative proofs of the convergence properties of the conjugate-gradient method
- Global convergence of a modified spectral three-term CG algorithm for nonconvex unconstrained optimization problems
- A three-term conjugate gradient method with accelerated subspace quadratic optimization
- Stopping criteria for, and strong convergence of, stochastic gradient descent on Bottou-Curtis-Nocedal functions
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- Limited memory BFGS method based on a high-order tensor model
- A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice
- A nonmonotone hybrid conjugate gradient method for unconstrained optimization
- Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- New hyrid conjugate gradient method as a convex combination of HZ and CD methods
- An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimisation
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions
- Two modified conjugate gradient methods for unconstrained optimization with applications in image restoration problems
- An efficient conjugate gradient-based algorithm for unconstrained optimization and its projection extension to large-scale constrained nonlinear equations with applications in signal recovery and image denoising problems
- Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- A new hybrid conjugate gradient method of unconstrained optimization methods
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter
- Mathematical Models of Placement Optimisation: Two- and Three-Dimensional Problems and Applications
- Note on an extension of “Davidon” methods to nondifferentiable functions
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- From linear to nonlinear iterative methods
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- Some modified conjugate gradient methods for unconstrained optimization
- New hybrid conjugate gradient method for unconstrained optimization
- Spectral method and its application to the conjugate gradient method
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- A new family of globally convergent conjugate gradient methods
- Global convergence properties of the two new dependent Fletcher-Reeves conjugate gradient methods
- Convergence properties of a class of nonlinear conjugate gradient methods
- On the convergence of \(s\)-dependent GFR conjugate gradient method for unconstrained optimization
- An improved Perry conjugate gradient method with adaptive parameter choice
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Convergence properties of the dependent PRP conjugate gradient methods
- A sufficient descent Dai-Yuan type nonlinear conjugate gradient method for unconstrained optimization problems
- A class of one parameter conjugate gradient methods
- Strong global convergence of an adaptive nonmonotone memory gradient method
- An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search
- A new conjugate gradient algorithm with sufficient descent property for unconstrained optimization
- An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems
- Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
- Efficient rank reduction of correlation matrices
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- Convergence and stability of line search methods for unconstrained optimization
- A descent nonlinear conjugate gradient method for large-scale unconstrained optimization
- A new class of nonlinear conjugate gradient coefficients with global convergence properties
- Two modified HS type conjugate gradient methods for unconstrained optimization problems
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- Comments on ``Hybrid conjugate gradient algorithm for unconstrained optimization
- Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- Global convergence of a nonlinear conjugate gradient method
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- A variant spectral-type FR conjugate gradient method and its global convergence
- A modified conjugate gradient algorithm with cyclic Barzilai-Borwein steplength for unconstrained optimization
- Packing congruent hyperspheres into a hypersphere
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Global convergence of algorithms with nonmonotone line search strategy in unconstrained optimization
- Globally convergent modified Perry's conjugate gradient method
- The global convergence of a new mixed conjugate gradient method for unconstrained optimization
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- A mixed spectral CD-DY conjugate gradient method
- Global convergence of a spectral conjugate gradient method for unconstrained optimization
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- A modified conjugacy condition and related nonlinear conjugate gradient method
- A novel fractional Tikhonov regularization coupled with an improved super-memory gradient method and application to dynamic force identification problems
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- A sufficient descent LS conjugate gradient method for unconstrained optimization problems
- A hybrid of DL and WYL nonlinear conjugate gradient methods
- Descent line search scheme using Geršgorin circle theorem
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
- New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods
- An extension of the Fletcher-Reeves method to linear equality constrained optimization problem
- Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search
- Further insight into the convergence of the Fletcher-Reeves method
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- Global convergence of a modified spectral FR conjugate gradient method
- Global convergence of a memory gradient method without line search
- CGRS -- an advanced hybrid method for global optimization of continuous functions closely coupling extended random search and conjugate gradient method
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- A limited memory descent Perry conjugate gradient method
- Dai-Kou type conjugate gradient methods with a line search only using gradient
- Analysis of a self-scaling quasi-Newton method
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- Conjugate gradient methods using value of objective function for unconstrained optimization
- A conjugate gradient method for unconstrained optimization problems
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- A modified CG-DESCENT method for unconstrained optimization
- A descent spectral conjugate gradient method for impulse noise removal
- A modified nonlinear conjugate gradient method with the Armijo line search and its application
- Two new conjugate gradient methods based on modified secant equations
- Inelastic analysis of suspension structures by nonlinear programming
- A class of modified FR conjugate gradient method and applications to non-negative matrix factorization
- A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization
- A note about WYL's conjugate gradient method and its applications
- Some three-term conjugate gradient methods with the inexact line search condition
- On three-term conjugate gradient algorithms for unconstrained optimization
- A new three-term conjugate gradient algorithm for unconstrained optimization
- Unified approach to unconstrained minimization via basic matrix factorizations
- A conjugate gradient method with descent direction for unconstrained optimization
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- A new Liu-Storey type nonlinear conjugate gradient method for unconstrained optimization problems
- A spectral three-term Hestenes-Stiefel conjugate gradient method
- Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization
- A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization
- Packing cylinders and rectangular parallelepipeds with distances between them into a given region
- Hybrid conjugate gradient method for a convex optimization problem over the fixed-point set of a nonexpansive mapping
- The revised DFP algorithm without exact line search
- A modified PRP conjugate gradient method
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- Convergence properties of the Beale-Powell restart algorithm
- A class of nonmonotone conjugate gradient methods for unconstrained optimization
- Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search
- The relationship between theorems of the alternative, least norm problems, steepest descent directions, and degeneracy: A review