On the complexity of approximating a KKT point of quadratic programming


DOI: 10.1007/BF01581726
zbMath: 0894.90117
OpenAlex: W2063676335
MaRDI QID: Q1380927

Yinyu Ye

Publication date: 7 September 1998

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://doi.org/10.1007/bf01581726
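
For context (not stated in this record), the title refers to Karush-Kuhn-Tucker (KKT) points of quadratic programs. A minimal sketch, assuming the standard formulation \(\min\ \tfrac12 x^{\top}Qx + c^{\top}x\) subject to \(Ax \le b\) (the paper's exact formulation and approximation measure may differ): a KKT point is a pair \((x, \lambda)\) satisfying

\[
\begin{aligned}
&\text{stationarity:} && Qx + c + A^{\top}\lambda = 0,\\
&\text{primal feasibility:} && Ax \le b,\\
&\text{dual feasibility:} && \lambda \ge 0,\\
&\text{complementarity:} && \lambda_i\,(Ax - b)_i = 0 \ \text{ for all } i,
\end{aligned}
\]

and an approximate (\(\varepsilon\)-)KKT point typically relaxes the stationarity and complementarity conditions to hold within a tolerance \(\varepsilon\).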



Related Items

A new branch-and-cut algorithm for non-convex quadratic programming via alternative direction method and semidefinite relaxation
A second-order optimality condition with first- and second-order complementarity associated with global convergence of algorithms
High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks
Accelerated Methods for NonConvex Optimization
A New Global Optimization Scheme for Quadratic Programs with Low-Rank Nonconvexity
An \(L_p\) Norm Relaxation Approach to Positive Influence Maximization in Social Network under the Deterministic Linear Threshold Model
Newton-KKT interior-point methods for indefinite quadratic programming
An improved algorithm for the \(L_2-L_p\) minimization problem
Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
Some theoretical limitations of second-order algorithms for smooth constrained optimization
A new global algorithm for factor-risk-constrained mean-variance portfolio selection
Effective algorithms for optimal portfolio deleveraging problem with cross impact
Linear-step solvability of some folded concave and singly-parametric sparse optimization problems
A note on the complexity of \(L_p\) minimization
A FPTAS for computing a symmetric Leontief competitive economy equilibrium
New global algorithms for quadratic programming with a few negative eigenvalues based on alternative direction method and convex relaxation
Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization
Copositive Relaxation Beats Lagrangian Dual Bounds in Quadratically and Linearly Constrained Quadratic Optimization Problems
Sample average approximation with sparsity-inducing penalty for high-dimensional stochastic programming
Optimality condition and complexity analysis for linearly-constrained optimization without differentiability on the boundary


