Reduced Hessian methods as a perturbed Newton-Lagrange method
DOI: 10.20310/2686-9667-2024-29-145-51-64 · MaRDI QID: Q6554812
Andreĭ Andreevich Volkov, Alekseĭ Fedorovich Izmailov, Evgeniĭ Ivanovich Uskov
Publication date: 13 June 2024
Published in: Vestnik Rossiĭskikh Universitetov. Matematika
Keywords: sequential quadratic programming; superlinear convergence; second-order corrections; equality-constrained optimisation problem; perturbed Newton-Lagrange method framework; reduced Hessian of Lagrangian
MSC classifications: Numerical mathematical programming methods (65K05); Methods of successive quadratic programming type (90C55)
Cites Work
- Title not available
- Title not available
- An analysis of reduced Hessian methods for constrained optimization
- On the Convergence of Constrained Optimization Methods with Accurate Hessian Information on a Subspace
- An only 2-step Q-superlinear convergence example for some algorithms that use reduced Hessian approximations
- On the Local Convergence of a Quasi-Newton Method for the Nonlinear Programming Problem
- An example of irregular convergence in some constrained optimization methods that use the projected Hessian
- Projected Hessian Updating Algorithms for Nonlinearly Constrained Optimization
- Newton-Type Methods for Optimization and Variational Problems
- Numerical optimization. Theoretical and practical aspects. Transl. from the French