A class of differential descent methods for constrained optimization
DOI: 10.1016/0022-247X(81)90012-3 · zbMath: 0446.90079 · MaRDI QID: Q1146118
Publication date: 1981
Published in: Journal of Mathematical Analysis and Applications
Keywords: constrained optimization; linear approximation; nonlinear equality constraints; asymptotically quadratic convergence rate; class of algorithms; continuously differentiable matrix; curvilinear search paths; differential descent methods; first-order Kuhn-Tucker optimality conditions; initial-valued system of differential equations; minimizing a nonlinear function
MSC classifications: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Rate of convergence, degree of approximation (41A25)
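The keywords describe the general idea behind this class of methods: minimize a nonlinear function subject to nonlinear equality constraints by following the solution of an initial-valued system of differential equations whose trajectories descend toward points satisfying the first-order Kuhn-Tucker conditions. The sketch below is an illustrative projected-gradient-flow variant of that idea, not the paper's actual algorithm; the function names, the restoration term, and the forward-Euler discretization (standing in for the curvilinear search paths) are all assumptions made for the example.

```python
import numpy as np

def differential_descent(f_grad, h, h_jac, x0, step=1e-2, iters=5000):
    """Illustrative sketch (not the paper's method): integrate the descent ODE
       x'(t) = -P(x) grad f(x) - J(x)^T (J J^T)^{-1} h(x)
    by forward Euler, where J = h'(x) and P projects onto the tangent
    space of the constraint manifold {x : h(x) = 0}."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = f_grad(x)
        J = np.atleast_2d(h_jac(x))
        JJt = J @ J.T
        # Tangential part: gradient projected onto the null space of J,
        # so the flow descends along the constraint manifold.
        lam = np.linalg.solve(JJt, J @ g)
        tangent = g - J.T @ lam
        # Restoration part: pulls the iterate back toward h(x) = 0
        # (h decays geometrically along the discretized flow).
        restore = J.T @ np.linalg.solve(JJt, np.atleast_1d(h(x)))
        x = x - step * (tangent + restore)
    return x

# Toy example: minimize x^2 + y^2 subject to x + y = 1 (solution x = y = 1/2).
sol = differential_descent(
    f_grad=lambda x: 2.0 * x,
    h=lambda x: x[0] + x[1] - 1.0,
    h_jac=lambda x: np.array([[1.0, 1.0]]),
    x0=np.array([2.0, -1.0]),
)
```

With a small fixed step the Euler iterates track the continuous trajectory, and the tangential and restoration components both contract, which is the discrete analogue of the asymptotic convergence the abstract's keywords refer to.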
Cites Work
- A Newton-type curvilinear search method for optimization
- Differential gradient methods
- A Newton-type curvilinear search method for constrained optimization
- Handbook series linear algebra. Linear least squares solutions by Householder transformations
- An alternate implementation of Goldfarb's minimization algorithm
- Quasi-Newton Methods for Unconstrained Optimization