A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
From MaRDI portal
Publication:3831957
DOI: 10.1137/0726042
zbMath: 0676.65061
OpenAlex: W2153532571
MaRDI QID: Q3831957
Byrd, Richard H., Nocedal, Jorge
Publication date: 1989
Published in: SIAM Journal on Numerical Analysis
Full work available at URL: https://doi.org/10.1137/0726042
Keywords: global convergence; quasi-Newton methods; superlinear convergence; BFGS update formula; Broyden-Fletcher-Goldfarb-Shanno method
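The keywords point to the paper's central device: the measure function ψ(B) = tr(B) − ln det(B), which several of the related items below cite as "the measure function of Byrd and Nocedal". The following is a minimal numerical sketch of the BFGS update together with ψ; the toy Hessian `A`, the step vectors, and the iteration count are illustrative assumptions, not anything from the paper itself.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update; requires the curvature condition y^T s > 0."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

def psi(B):
    """Byrd-Nocedal measure psi(B) = tr(B) - ln det(B); finite iff B is SPD."""
    return np.trace(B) - np.log(np.linalg.det(B))

# Toy quadratic f(x) = 0.5 x^T A x with SPD Hessian A (illustrative choice)
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
B = np.eye(2)                      # initial Hessian approximation
rng = np.random.default_rng(0)
for _ in range(5):
    s = rng.standard_normal(2)     # arbitrary step (assumption, no line search)
    y = A @ s                      # exact gradient difference for the quadratic
    B = bfgs_update(B, s, y)       # y^T s = s^T A s > 0, so B stays SPD

print(psi(B))                      # stays finite: B remains positive definite
```

Since each eigenvalue λ of an SPD matrix contributes λ − ln λ ≥ 1, one has ψ(B) ≥ n, with equality only at B = I; boundedness of ψ along the iteration is the handle the paper uses to control the eigenvalues of the BFGS matrices.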
Related Items
Global convergence of the Broyden's class of quasi-Newton methods with nonmonotone linesearch, Forward-backward quasi-Newton methods for nonsmooth optimization problems, A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem, A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems, Global convergence properties of the modified BFGS method associating with general line search model, A scaled three-term conjugate gradient method for unconstrained optimization, Convergence and numerical results for a parallel asynchronous quasi-Newton method, On \(q\)-BFGS algorithm for unconstrained optimization problems, A parallel quasi-Newton algorithm for unconstrained optimization, Limited-memory BFGS with displacement aggregation, Rates of superlinear convergence for classical quasi-Newton methods, A globally and R-linearly convergent hybrid HS and PRP method and its inexact version with applications, An analysis of reduced Hessian methods for constrained optimization, Global convergence of the non-quasi-Newton method for unconstrained optimization problems, Global convergence properties of the two new dependent Fletcher-Reeves conjugate gradient methods, Modifying the BFGS method, A quasi-Newton method with Wolfe line searches for multiobjective optimization, The global convergence of the BFGS method with a modified WWP line search for nonconvex functions, Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model, Parallel quasi-Newton methods for unconstrained optimization, Some convergence properties of descent methods, Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization, A modified nonmonotone BFGS algorithm for unconstrained optimization, The global convergence of a modified BFGS method for nonconvex functions, Nonmonotone 
spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization, Global convergence of a modified Broyden family method for nonconvex functions, On the formulation and theory of the Newton interior-point method for nonlinear programming, Global convergence properties of two modified BFGS-type methods, A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization, Using function-values in multi-step quasi-Newton methods, Spectral scaling BFGS method, A trust-region-based BFGS method with line search technique for symmetric nonlinear equations, Two penalized mixed-integer nonlinear programming approaches to tackle multicollinearity and outliers effects in linear regression models, A double parameter scaled BFGS method for unconstrained optimization, Global convergence of a modified limited memory BFGS method for non-convex minimization, New cautious BFGS algorithm based on modified Armijo-type line search, Extra multistep BFGS updates in quasi-Newton methods, A new descent method for symmetric non-monotone variational inequalities with application to eigenvalue complementarity problems, The global convergence of the BFGS method under a modified Yuan-Wei-Lu line search technique, A cautious BFGS update for reduced Hessian SQP, A Riemannian view on shape optimization, Gradient method with multiple damping for large-scale unconstrained optimization, New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method, A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems, The projection technique for two open problems of unconstrained optimization problems, The convergence of a new modified BFGS method without line searches for unconstrained optimization or complexity systems, Secant penalized BFGS: a noise robust 
quasi-Newton method via penalizing the secant condition, On the limited memory BFGS method for large scale optimization, Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization, Limited memory BFGS method with backtracking for symmetric nonlinear equations, The hybrid BFGS-CG method in solving unconstrained optimization problems, A limited memory BFGS method for solving large-scale symmetric nonlinear equations, The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions, A class of gradient unconstrained minimization algorithms with adaptive stepsize, The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients, New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods, On the Newton interior-point method for nonlinear programming problems, Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search, Further insight into the convergence of the Fletcher-Reeves method, A partitioned PSB method for partially separable unconstrained optimization problems, Analysis of sparse quasi-Newton updates with positive definite matrix completion, A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule, Global convergence of the nonmonotone MBFGS method for nonconvex unconstrained minimization, An adaptive scaled BFGS method for unconstrained optimization, A new modified BFGS method for unconstrained optimization problems, Damped techniques for the limited memory BFGS method for large-scale optimization, A regularized limited memory BFGS method for nonconvex unconstrained minimization, An inexact proximal regularization method for unconstrained optimization, Analysis of a self-scaling quasi-Newton method, Accumulative approach in multistep diagonal gradient-type method for large-scale unconstrained 
optimization, A reduced Hessian SQP method for inequality constrained optimization, Convergence analysis of an improved BFGS method and its application in the Muskingum model, Convergence property of a class of variable metric methods., A stochastic quasi-Newton method for simulation response optimization, A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization, New quasi-Newton methods for unconstrained optimization problems, Convergence analysis of a modified BFGS method on convex minimizations, Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints, A variation of Broyden class methods using Householder adaptive transforms, Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems, New results on superlinear convergence of classical quasi-Newton methods, New line search methods for unconstrained optimization, Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search, A limited memory BFGS-type method for large-scale unconstrained optimization, A family of variable metric proximal methods, Two modified Dai-Yuan nonlinear conjugate gradient methods, A limited memory \(q\)-BFGS algorithm for unconstrained optimization problems, Gauss-Newton-based BFGS method with filter for unconstrained minimization, A descent cautious BFGS method for computing US-eigenvalues of symmetric complex tensors, Adaptive scaling damped BFGS method without gradient Lipschitz continuity, A globally convergent BFGS method with nonmonotone line search for non-convex minimization, A new backtracking inexact BFGS method for symmetric nonlinear equations, Łojasiewicz gradient inequalities for polynomial functions and some applications, A globally convergent BFGS method for symmetric nonlinear equations, Globally convergent BFGS method for nonsmooth convex optimization, A new type of quasi-Newton updating formulas based on the new 
quasi-Newton equation, Applications of semidefinite programming, A modified BFGS type quasi-Newton method with line search for symmetric nonlinear equations problems, On the use of consistent approximations in the solution of semi-infinite optimization and optimal control problems, An adaptive sizing BFGS method for unconstrained optimization, Improved high-dimensional regression models with matrix approximations applied to the comparative case studies with support vector machines, A Riemannian BFGS Method for Nonconvex Optimization Problems, A modified conjugate gradient method based on a modified secant equation, Convergence of the BFGS-SQP Method for Degenerate Problems, A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization, Damped techniques for enforcing convergence of quasi-Newton methods, A Modified Non-Monotone BFGS Method for Non-Convex Unconstrained Optimization, Forward-Backward Envelope for the Sum of Two Nonconvex Functions: Further Properties and Nonmonotone Linesearch Algorithms, Towards explicit superlinear convergence rate for SR1, Proximal variable metric method with spectral diagonal update for large scale sparse optimization, A Reproducing Kernel Hilbert Space Approach to Functional Calibration of Computer Models, Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization, Eigenvalue analyses on the memoryless Davidon-Fletcher-Powell method based on a spectral secant equation, A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem, Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization, An adaptive projection BFGS method for nonconvex unconstrained optimization problems, A Riemannian BFGS Method Without Differentiated Retraction for Nonconvex Optimization Problems, 
Convergence analysis of the self-dual optimally conditioned SSVM method of Oren-Spedicato, A new class of quasi-Newton updating formulas, Global convergence of the BFGS algorithm with nonmonotone linesearch, A Structured Quasi-Newton Algorithm for Optimizing with Incomplete Hessian Information, A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization, A globally convergent BFGS method for pseudo-monotone variational inequality problems, New BFGS method for unconstrained optimization problem based on modified Armijo line search, A BFGS algorithm for solving symmetric nonlinear equations, A modified BFGS method and its global convergence in nonconvex minimization, Nonsmooth equation based BFGS method for solving KKT systems in mathematical programming, A nonmonotone Broyden method for unconstrained optimization, Convergence analysis of the Levenberg–Marquardt method, A globally convergent BFGS method for nonlinear monotone equations without any merit functions, A norm descent BFGS method for solving KKT systems of symmetric variational inequality problems, A practical update criterion for SQP method, A globally convergent BFGS method for nonconvex minimization without line searches, Analysis of the BFGS Method with Errors, Matrix analyses on the Dai–Liao conjugate gradient method, Linear Convergence of Descent Methods for the Unconstrained Minimization of Restricted Strongly Convex Functions, Diagonal quasi-Newton method via variational principle under generalized Frobenius norm, Maximum Entropy Derivation of Quasi-Newton Methods, An Adaptive Smoothing Method for Continuous Minimax Problems, Greedy Quasi-Newton Methods with Explicit Superlinear Convergence, A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization, An improved quasi-Newton method for unconstrained optimization, Extra updates for the BFGS method, A derivative-free line search and DFP method for symmetric equations with global and superlinear convergence, A class of modified BFGS methods with function value information for unconstrained optimization, A Noise-Tolerant Quasi-Newton Algorithm for Unconstrained Optimization