Local convergence analysis for partitioned quasi-Newton updates


DOI: 10.1007/BF01407874
zbMath: 0505.65018
OpenAlex: W2030405518
Wikidata: Q57389753 (Scholia: Q57389753)
MaRDI QID: Q1836466

Philippe L. Toint, Andreas Griewank

Publication date: 1982

Published in: Numerische Mathematik

Full work available at URL: https://eudml.org/doc/132808



Related Items

A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems
A parallel quasi-Newton algorithm for unconstrained optimization
Rates of superlinear convergence for classical quasi-Newton methods
Convergence theory for the structured BFGS secant method with an application to nonlinear least squares
The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
Difference Newton-like methods under weak continuity conditions
The global convergence of a modified BFGS method for nonconvex functions
A trust-region-based BFGS method with line search technique for symmetric nonlinear equations
A Broyden Class of Quasi-Newton Methods for Riemannian Optimization
The global convergence of the BFGS method under a modified Yuan-Wei-Lu line search technique
Non-asymptotic superlinear convergence of standard quasi-Newton methods
A novel iterative learning control scheme based on Broyden-class optimization method
The projection technique for two open problems of unconstrained optimization problems
An SR1/BFGS SQP algorithm for nonconvex nonlinear programs with block-diagonal Hessian matrix
Local and Q-superlinear convergence of a class of collinear scaling algorithms that extends quasi-Newton methods with Broyden's bounded-⊘ class of updates
On the stable global convergence of particular quasi-Newton methods
An adaptive projection BFGS method for nonconvex unconstrained optimization problems
On large scale nonlinear network optimization
On the limited memory BFGS method for large scale optimization
A conjugate directions approach to improve the limited-memory BFGS method
Superlinearly convergent exact penalty methods with projected structured secant updates for constrained nonlinear least squares
Convergence acceleration of direct trajectory optimization using novel Hessian calculation methods
The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions
The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
A partitioned PSB method for partially separable unconstrained optimization problems
A Structured Quasi-Newton Algorithm for Optimizing with Incomplete Hessian Information
A family of the local convergence of the improved secant methods for nonlinear equality constrained optimization subject to bounds on variables
Analysis of sparse quasi-Newton updates with positive definite matrix completion
The global and superlinear convergence of a new nonmonotone MBFGS algorithm on convex objective functions
Block BFGS Methods
Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions
Solving nonlinear systems of equations by means of quasi-Newton methods with a nonmonotone strategy
Optimizing partially separable functions without derivatives
Partitioned quasi-Newton methods for sparse nonlinear equations
New BFGS method for unconstrained optimization problem based on modified Armijo line search
The superlinear convergence of a new quasi-Newton-SQP method for constrained optimization
Global convergence of the partitioned BFGS algorithm for convex partially separable optimization
Stable factorized quasi-Newton methods for nonlinear least-squares problems
A modified BFGS method and its global convergence in nonconvex minimization
New quasi-Newton methods for unconstrained optimization problems
A nonmonotone Broyden method for unconstrained optimization
Convergence analysis of a modified BFGS method on convex minimizations
On the existence of convex decompositions of partially separable functions
On efficient Hessian computation using the edge pushing algorithm in Julia
New results on superlinear convergence of classical quasi-Newton methods
New line search methods for unconstrained optimization
Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
Recognizing underlying sparsity in optimization
A unified convergence framework for nonmonotone inexact decomposition methods
ve08
A new backtracking inexact BFGS method for symmetric nonlinear equations
A two-step superlinearly convergent projected structured BFGS method for constrained nonlinear least squares
Greedy Quasi-Newton Methods with Explicit Superlinear Convergence
The convergence of matrices generated by rank-2 methods from the restricted \(\beta\)-class of Broyden
Variable metric methods for unconstrained optimization and nonlinear least squares
Adjoint-based SQP method with block-wise quasi-Newton Jacobian updates for nonlinear optimal control
Quasi-Newton methods for solving underdetermined nonlinear simultaneous equations
A derivative-free line search and DFP method for symmetric equations with global and superlinear convergence



Cites Work