A convergent hierarchy of SDP relaxations for a class of hard robust global polynomial optimization problems
DOI: 10.1016/j.orl.2017.04.005 · zbMath: 1409.90182 · OpenAlex: W2611961450 · Wikidata: Q59241415 · Scholia: Q59241415 · MaRDI QID: Q1728251
Guoyin Li, Vaithilingam Jeyakumar, Nguyen Huy Chieu
Publication date: 22 February 2019
Published in: Operations Research Letters
Full work available at URL: https://doi.org/10.1016/j.orl.2017.04.005
Keywords: nonconvex optimization; robust optimization; global polynomial optimization; optimization under data uncertainty; semi-definite programming relaxations
MSC: Semidefinite programming (90C22); Nonlinear programming (90C30); Sensitivity, stability, parametric optimization (90C31)
Related Items (3)
Uses Software
Cites Work
- Semidefinite programming relaxation methods for global optimization problems with sparse polynomials and unbounded semialgebraic feasible sets
- On polynomial optimization over non-compact semi-algebraic sets
- An extension of sums of squares relaxations to polynomial optimization problems over symmetric cones
- Robust convex quadratically constrained programs
- Convergence of the Lasserre hierarchy of SDP relaxations for convex polynomial programs without compactness
- Exact SDP relaxations for classes of nonlinear semidefinite programming problems
- Robust SOS-convex polynomial optimization problems: exact SDP relaxations
- Matrix sum-of-squares relaxations for robust semi-definite programs
- Robust global optimization with polynomials
- Exploiting Sparsity in SDP Relaxation of Polynomial Optimization Problems
- Algorithm 920
- Constructing Uncertainty Sets for Robust Linear Optimization
- Theory and Applications of Robust Optimization
- Pre- and Post-Processing Sum-of-Squares Programs in Practice
- Convergent Relaxations of Polynomial Matrix Inequalities and Static Output Feedback