Multilevel least-change Newton-like methods for equality constrained optimization problems
Publication:3028736
DOI: 10.1007/BF02591686 · zbMath: 0625.90076 · OpenAlex: W2028913091 · MaRDI QID: Q3028736
Publication date: 1987
Published in: Mathematical Programming
Full work available at URL: https://doi.org/10.1007/bf02591686
Keywords: inverse Hessian; q-superlinear convergence; equality constrained smooth optimization; multilevel least-change updates; projections of fragments of the augmented Hessian matrix
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37)
Cites Work
- Diagonalized multiplier methods and quasi-Newton methods for constrained optimization
- Multiplier and gradient methods
- Convergence Theorems for Least-Change Secant Update Methods
- Local and superlinear convergence for truncated iterated projections methods
- Orthogonal Projections on Convex Sets for Newton-Like Methods
- On the Local Convergence of Quasi-Newton Methods for Constrained Optimization
- An Ideal Penalty Function for Constrained Optimization
- Dual Variable Metric Algorithms for Constrained Optimization
- Superlinearly convergent variable metric algorithms for general nonlinear programming problems
- On Sparse and Symmetric Matrix Updating Subject to a Linear Equation
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Variations on Variable-Metric Methods