On the convergence of projected gradient processes to singular critical points
From MaRDI portal
Publication: 1821692
DOI: 10.1007/BF00939081 | zbMath: 0616.90060 | OpenAlex: W2090243370 | MaRDI QID: Q1821692
Publication date: 1987
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/bf00939081
Keywords: asymptotic stability; projected gradient methods; convex feasible sets; singular minimizers; convergence rate theorems; nonconvex objective functions
MSC classification: Nonlinear programming (90C30); Sensitivity, stability, parametric optimization (90C31); Numerical methods based on nonlinear programming (49M37)
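As context for the entry above: the method the title refers to is the projected gradient iteration x⁺ = P_C(x − α∇f(x)), where P_C is the Euclidean projection onto the convex feasible set C. The sketch below is not taken from the paper; it is a minimal illustration of that iteration for the special case of box constraints, with an illustrative quadratic objective and a fixed step size chosen for the example.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box {x : lo <= x <= hi}."""
    return np.clip(x, lo, hi)

def projected_gradient(grad, x0, lo, hi, step=0.1, iters=200):
    """Fixed-step projected gradient iteration x+ = P_C(x - step * grad(x))."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = project_box(x - step * grad(x), lo, hi)
    return x

# Illustrative problem: minimize f(x) = ||x - c||^2 / 2 over the box [0, 1]^2
# with c = (2, -1); the constrained minimizer is P_C(c) = (1, 0).
c = np.array([2.0, -1.0])
x_star = projected_gradient(lambda x: x - c, np.zeros(2), 0.0, 1.0)
```

Here the iterates reach the boundary minimizer (1, 0), where both box constraints are active; the paper's concern is the rate of such convergence when the minimizer is singular.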
Related Items
- Some recent advances in projection-type methods for variational inequalities
- A projected Newton method in a Cartesian product of balls
- A projected Newton method for minimization problems with nonlinear inequality constraints
- Proximal methods avoid active strict saddles of weakly convex functions
- LMBOPT: a limited memory method for bound-constrained optimization
- On the rate of convergence of projected Barzilai–Borwein methods
- Variable metric gradient projection processes in convex feasible sets defined by nonlinear inequalities
- Partial Smoothness and Constant Rank
- Active-Set Identification with Complexity Guarantees of an Almost Cyclic 2-Coordinate Descent Method with Armijo Line Search
- Algorithms for bound constrained quadratic programming problems
- An Extension of the Projected Gradient Method to a Banach Space Setting with Application in Structural Topology Optimization
- Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming
- Linear convergence analysis of the use of gradient projection methods on total variation problems
- Global convergence of a modified gradient projection method for convex constrained problems
- Modified active set projected spectral gradient method for bound constrained optimization
- Convergence properties of trust region methods for linear and convex constraints
- Optimality, identifiability, and sensitivity
- Solution of projection problems over polytopes
- A projection and contraction method for a class of linear complementarity problems and its application in convex quadratic programming
- On Regularization and Active-set Methods with Complexity for Constrained Optimization
- Minimum principle sufficiency
- Ergodic convergence in subgradient optimization
- Convergence properties of nonmonotone spectral projected gradient methods
- The chain rule for VU-decompositions of nonsmooth functions
- A potential reduction algorithm for linearly constrained convex programming
- Generic Minimizing Behavior in Semialgebraic Optimization
- Column Generation Algorithms for Nonlinear Optimization, I: Convergence Analysis
- Convergence analysis of a projection algorithm for variational inequality problems
- A superlinearly convergent SSDP algorithm for nonlinear semidefinite programming
- On the convergence properties of scaled gradient projection methods with non-monotone Armijo-like line searches
- Active-Set Newton Methods and Partial Smoothness
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Finite convergence of algorithms for nonlinear programs and variational inequalities
- Interior point methods for optimal control of discrete time systems
Cites Work
- A class of superlinearly convergent projection algorithms with relaxed stepsizes
- Asymptotic decay rates from the growth properties of Lyapunov functions near singular attractors
- Newton-Goldstein convergence rates for convex constrained minimization problems with singular solutions
- Two-Metric Projection Methods for Constrained Optimization
- Newton’s Method and the Goldstein Step-Length Rule for Constrained Minimization Problems
- On the Goldstein-Levitin-Polyak gradient projection method
- Projected Newton Methods for Optimization Problems with Simple Constraints
- Convex programming in Hilbert space
- On Steepest Descent