Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
Publication: 5242932
DOI: 10.1137/18M1175562 · zbMath: 1431.90109 · arXiv: 1803.06566 · Wikidata: Q126856101 · Scholia: Q126856101 · MaRDI QID: Q5242932
Ying Cui, Defeng Sun, Kim-Chuan Toh
Publication date: 8 November 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1803.06566
MSC classification: Semidefinite programming (90C22); Convex programming (90C25); Large-scale problems in mathematical programming (90C06)
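For context, the problem in the title asks for the projection of a given matrix G onto the intersection of a polyhedral set and the doubly nonnegative (DNN) cone: minimize the Frobenius distance to G over all X that satisfy the polyhedral constraints, are positive semidefinite, and are entrywise nonnegative. The sketch below only illustrates this problem statement using a generic conic modeling tool; CVXPY, the SCS solver, and the unit-diagonal polyhedral constraint are illustrative assumptions, and the sketch does not reflect the specialized algorithm developed in the paper itself.

```python
# A minimal sketch (not the paper's algorithm): project a matrix G onto the
# intersection of a polyhedral set and the doubly nonnegative (DNN) cone,
# solved here with a generic conic solver via CVXPY. Data are illustrative.
import numpy as np
import cvxpy as cp

n = 5
rng = np.random.default_rng(0)
G = rng.standard_normal((n, n))
G = (G + G.T) / 2  # symmetrize the matrix to be approximated

X = cp.Variable((n, n), symmetric=True)
constraints = [
    X >> 0,           # positive semidefinite
    X >= 0,           # entrywise nonnegative (together with the above: DNN)
    cp.diag(X) == 1,  # an illustrative polyhedral constraint (unit diagonal)
]
prob = cp.Problem(cp.Minimize(cp.norm(X - G, "fro")), constraints)
prob.solve(solver=cp.SCS)
print("optimal distance:", prob.value)
```

For problem sizes where such a generic solver becomes impractical, a specialized method like the one developed in this paper is the point; the snippet above only states the best-approximation problem in code.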
Related Items
- Multicomposite Nonconvex Optimization for Training Deep Neural Networks
- On Degenerate Doubly Nonnegative Projection Problems
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Inexact coordinate descent: complexity and preconditioning
- First-order methods of smooth convex optimization with inexact oracle
- An optimal method for stochastic composite optimization
- On the complexity analysis of randomized block-coordinate descent methods
- SDPNAL+: a majorized semismooth Newton-CG augmented Lagrangian method for semidefinite programming with nonnegative constraints
- Iteration complexity analysis of block coordinate descent methods
- A modified alternating direction method for convex quadratically constrained quadratic semidefinite programs
- Robust matrix completion
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
- A coordinate gradient descent method for nonsmooth separable minimization
- Constrained interpolation and smoothing
- Solution of monotone complementarity problems with locally Lipschitzian functions
- Strong conical hull intersection property, bounded linear regularity, Jameson's property \((G)\), and error bounds in convex optimization
- Quadratic convergence of Newton's method for convex interpolation and smoothing
- A globally convergent Newton method for convex \(SC^1\) minimization problems
- Randomness and permutations in coordinate descent methods
- Coordinate descent algorithms
- A nonsmooth version of Newton's method
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- An Efficient Inexact ABCD Method for Least Squares Semidefinite Programming
- Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
- Accelerated and Inexact Forward-Backward Algorithms
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
- A Newton-CG Augmented Lagrangian Method for Semidefinite Programming
- Smooth Optimization with Approximate Gradient
- A Quadratically Convergent Newton Method for Computing the Nearest Correlation Matrix
- Accelerated, Parallel, and Proximal Coordinate Descent
- Calibrating Least Squares Semidefinite Programming with Equality and Inequality Constraints
- An Algorithm for Constrained Interpolation
- Duality and well-posedness in convex interpolation
- Bi-CGSTAB: A Fast and Smoothly Converging Variant of Bi-CG for the Solution of Nonsymmetric Linear Systems
- Semismooth and Semiconvex Functions in Constrained Optimization
- Variational Analysis
- A Dual Approach to Semidefinite Least-Squares Problems
- On Projection Algorithms for Solving Convex Feasibility Problems
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- An Inexact Accelerated Proximal Gradient Method for Large Scale Linearly Constrained Convex SDP
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- A remark on accelerated block coordinate descent for computing the proximity operators of a sum of convex functions
- Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
- On the Convergence of Block Coordinate Descent Type Methods
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization, II: Shrinking Procedures and Optimal Algorithms
- Least-Squares Covariance Matrix Adjustment
- Semismooth Matrix-Valued Functions
- Convergence of Newton's method for convex best interpolation
- Convergence of a block coordinate descent method for nondifferentiable minimization