A feasible SQP-GS algorithm for nonconvex, nonsmooth constrained optimization (Q393748)
From MaRDI portal
scientific article; zbMATH DE number 6249854
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | A feasible SQP-GS algorithm for nonconvex, nonsmooth constrained optimization | scientific article; zbMATH DE number 6249854 | |
Statements
A feasible SQP-GS algorithm for nonconvex, nonsmooth constrained optimization (English)
24 January 2014
An algorithm combining the gradient sampling (GS) technique with the sequential quadratic programming (SQP) method is presented for nonconvex, nonsmooth constrained optimization problems with locally Lipschitz, continuously differentiable functions. The proposed algorithm generates a sequence of feasible iterates and guarantees that the objective function values decrease monotonically. It is an alternative to the penalty-based SQP-GS algorithm proposed by \textit{F. E. Curtis} and \textit{M. L. Overton} [SIAM J. Optim. 22, No. 2, 474--500 (2012; Zbl 1246.49031)]. Instead of a penalty function serving as a merit function to generate the next iterate, the authors use the improvement function, one of the most effective tools for handling constraints, which plays a significant role in the global convergence analysis.
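To give intuition for the gradient sampling ingredient, the following is a minimal sketch of a plain (unconstrained) GS iteration, not the paper's feasible SQP-GS method: gradients are sampled in a small ball around the current point, an approximate minimal-norm element of their convex hull is taken as a descent direction (here via a simple Frank-Wolfe loop rather than an exact QP solve), and a backtracking line search enforces the monotone decrease property mentioned above. All function and parameter names are illustrative assumptions; NumPy is assumed available.

```python
import numpy as np

def gs_direction(grad, x, eps, m=10, rng=None):
    """Approximate steepest-descent direction from sampled gradients.

    Samples m gradients in an eps-ball around x (plus the gradient at x),
    then approximates the minimal-norm element of their convex hull with
    a Frank-Wolfe loop over the simplex of convex weights.
    """
    rng = np.random.default_rng() if rng is None else rng
    pts = [x] + [x + eps * rng.uniform(-1, 1, size=x.shape) for _ in range(m)]
    G = np.array([grad(p) for p in pts])          # rows = sampled gradients
    lam = np.full(len(G), 1.0 / len(G))           # uniform convex weights
    for k in range(200):
        g = G @ (G.T @ lam)                       # gradient of 0.5 * ||G^T lam||^2
        j = int(np.argmin(g))                     # best simplex vertex
        e = np.zeros_like(lam)
        e[j] = 1.0
        lam = lam + (2.0 / (k + 2.0)) * (e - lam) # Frank-Wolfe update
    return -(G.T @ lam)                           # negated min-norm element

def minimize_gs(f, grad, x0, eps=0.1, step=0.1, iters=50, seed=0):
    """Toy gradient-sampling loop with backtracking for monotone decrease."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        d = gs_direction(grad, x, eps, rng=rng)
        t = step
        while f(x + t * d) >= f(x) and t > 1e-12:
            t *= 0.5                              # backtrack until decrease
        if t <= 1e-12:
            eps *= 0.5                            # no decrease: shrink sampling radius
        else:
            x = x + t * d
    return x
```

Applied to a simple nonsmooth function such as f(x, y) = |x| + 2|y|, the loop drives the iterates toward the kink at the origin while the sampled gradients from both sides of each kink keep the combined direction well defined.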
Keywords: constrained optimization; nonsmooth optimization; nonconvex optimization; nonlinear programming; gradient sampling; sequential quadratic programming; feasible algorithm; global convergence; Clarke subdifferential; numerical experiments