Superlinearly convergent approximate Newton methods for LC\(^1\) optimization problems
From MaRDI portal
Publication: 1332309
DOI: 10.1007/BF01582577 · zbMath: 0820.90102 · OpenAlex: W2042725681 · MaRDI QID: Q1332309
Publication date: 10 October 1994
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/bf01582577
MSC classifications:
- Nonlinear programming (90C30)
- Computational methods for problems pertaining to operations research and mathematical programming (90-08)
Related Items
- Some improved convergence results for variational inequality problems
- ON SOME NCP-FUNCTIONS BASED ON THE GENERALIZED FISCHER–BURMEISTER FUNCTION
- An ODE-based trust region method for unconstrained optimization problems
- A dual active-set proximal Newton algorithm for sparse approximation of correlation matrices
- An SQP algorithm for extended linear-quadratic problems in stochastic programming
- A globally convergent Newton method for convex \(SC^1\) minimization problems
- A preconditioning proximal Newton method for nondifferentiable convex optimization
- Newton's method for quadratic stochastic programs with recourse
- Minimization of \(SC^1\) functions and the Maratos effect
- The semismooth-related properties of a merit function and a descent method for the nonlinear complementarity problem
- A superlinearly convergent ODE-type trust region algorithm for nonsmooth nonlinear equations
- Some convergence properties of descent methods
- Local feasible QP-free algorithms for the constrained minimization of SC\(^1\) functions
- A parallel inexact Newton method for stochastic programs with recourse
- Approximate Newton methods for nonsmooth equations
- New version of the Newton method for nonsmooth equations
- The Josephy-Newton method for semismooth generalized equations and semismooth SQP for optimization
- An SQP-type method and its application in stochastic programs
- A note on upper Lipschitz stability, error bounds, and critical multipliers for Lipschitz-continuous KKT systems
- A perturbed version of an inexact generalized Newton method for solving nonsmooth equations
- A filter-trust-region method for \(LC^1\) unconstrained optimization and its global convergence
- Stochastic ultimate load analysis: models and solution methods
- Differentiability properties of functions that are \(\ell\)-stable at a point
- On second-order conditions in unconstrained optimization
- Superlinearly convergent affine scaling interior trust-region method for linear constrained \(LC^{1}\) minimization
- A globally and superlinearly convergent trust region method for \(LC^1\) optimization problems
- Differentiability and semismoothness properties of integral functions and their applications
- A finite Newton method for classification
- A Parametric Newton Method for Optimization Problems in Hilbert Spaces
- An accelerated active-set algorithm for a quadratic semidefinite program with general constraints
- An Affine Scaling Interior Point Filter Line-Search Algorithm for Linear Inequality Constrained Minimization
- Local convergence of the method of multipliers for variational and optimization problems under the noncriticality assumption
- The \(SC^1\) property of an expected residual function arising from stochastic complementarity problems
- New constrained optimization reformulation of complementarity problems
- Convergence analysis of a proximal Newton method
- Fréchet approach in second-order optimization
- Semismooth SQP method for equality-constrained optimization problems with an application to the lifted reformulation of mathematical programs with complementarity constraints
- Minimal approximate Hessians for continuously Gâteaux differentiable functions
- A note on second-order optimality conditions
- On Lipschitz behaviour of some generalized derivatives
- Globally and superlinearly convergent trust-region algorithm for convex \(SC^1\)-minimization problems and its application to stochastic programs
- Newton's method and quasi-Newton-SQP method for general \(\text{LC}^1\) constrained optimization
- On preconditioned Uzawa methods and SOR methods for saddle-point problems
- Newton-type methods for stochastic programming
- On relations and applications of generalized second-order directional derivatives
- Some Quadrature-Based Versions of the Generalized Newton Method for Solving Unconstrained Optimization Problems
Cites Work
- An interior point algorithm of O\((\sqrt{m}\,|\ln\varepsilon|)\) iterations for \(C^1\)-convex programming
- Generalized Hessian matrix and second-order optimality conditions for problems with \(C^{1,1}\) data
- Computational schemes for large-scale problems in extended linear-quadratic programming
- A unified approach to global convergence of trust region methods for nonsmooth optimization
- An SQP algorithm for extended linear-quadratic problems in stochastic programming
- A nonsmooth version of Newton's method
- Semismoothness and decomposition of maximal normal operators
- Nonsmooth Equations: Motivation and Algorithms
- Generalized Linear-Quadratic Problems of Deterministic and Stochastic Optimal Control in Discrete Time
- Optimization and nonsmooth analysis
- On the Extension of Constrained Optimization Algorithms from Differentiable to Nondifferentiable Problems
- Minimization of Locally Lipschitzian Functions
- Perturbed Kuhn-Tucker points and rates of convergence for a class of nonlinear-programming algorithms
- Superlinearly convergent quasi-Newton algorithms for nonlinearly constrained optimization problems
- Superlinearly convergent variable metric algorithms for general nonlinear programming problems
- Semismooth and Semiconvex Functions in Constrained Optimization
- Primal-Dual Projected Gradient Algorithms for Extended Linear-Quadratic Programming
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Penalty function versus non-penalty function methods for constrained nonlinear programming problems
- A quadratically-convergent algorithm for general nonlinear programming problems