Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
DOI: 10.1007/s10107-019-01420-0 · zbMath: 1458.90516 · arXiv: 1808.02901 · OpenAlex: W2965299328 · Wikidata: Q127404094 · Scholia: Q127404094 · MaRDI QID: Q2220653
Publication date: 25 January 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1808.02901
Keywords: convex optimization; saddle point problems; information-based complexity; lower complexity bound; first-order methods
MSC: Analysis of algorithms and problem complexity (68Q25); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Abstract computational complexity for mathematical programming problems (90C60); Numerical methods based on nonlinear programming (49M37)
Related Items (29)
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient sliding for composite optimization
- Gradient methods for minimizing composite functions
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- First-order methods of smooth convex optimization with inexact oracle
- On lower complexity bounds for large-scale smooth convex optimization
- On optimality of Krylov's information when solving linear operator equations
- Information-based complexity of linear operator equations
- Introductory lectures on convex optimization. A basic course.
- Accelerated schemes for a class of variational inequalities
- First-order algorithms for convex optimization with nonseparable objective and coupled constraints
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- An optimal randomized incremental gradient method
- A new primal-dual algorithm for minimizing the sum of three functions with a linear operator
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Lower bounds for finding stationary points I
- Randomized primal-dual proximal block coordinate updates
- Conditional Gradient Sliding for Convex Optimization
- On the $O(1/n)$ Convergence Rate of the Douglas–Rachford Alternating Direction Method
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
- A General Framework for a Class of First Order Primal-Dual Algorithms for Convex Optimization in Imaging Science
- Complexity of Variants of Tseng's Modified F-B Splitting and Korpelevich's Methods for Hemivariational Inequalities with Applications to Saddle-point and Convex Optimization Problems
- An Accelerated HPE-Type Algorithm for a Class of Composite Convex-Concave Saddle-Point Problems
- Robust Stochastic Approximation Approach to Stochastic Programming
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Accelerated First-Order Primal-Dual Proximal Methods for Linearly Constrained Composite Convex Programming
- Fast Alternating Direction Optimization Methods
- Optimal Primal-Dual Methods for a Class of Saddle Point Problems
- An Accelerated Linearized Alternating Direction Method of Multipliers
- Iteration-Complexity of Block-Decomposition Algorithms and the Alternating Direction Method of Multipliers
- Iteration-complexity of first-order augmented Lagrangian methods for convex programming