On lower iteration complexity bounds for the convex concave saddle point problems
From MaRDI portal
Publication: 2149573
DOI: 10.1007/s10107-021-01660-z
zbMath: 1494.90127
arXiv: 1912.07481
OpenAlex: W3166764897
MaRDI QID: Q2149573
Mingyi Hong, Junyu Zhang, Shu-Zhong Zhang
Publication date: 29 June 2022
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1912.07481
Related Items
- New First-Order Algorithms for Stochastic Variational Inequalities
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
- Transformed primal-dual methods for nonlinear saddle point systems
- No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization
- Robust Accelerated Primal-Dual Methods for Computing Saddle Points
- Optimality Conditions for Nonsmooth Nonconvex-Nonconcave Min-Max Problems and Generative Adversarial Networks
- Smooth monotone stochastic variational inequalities and saddle point problems: a survey
Uses Software
Cites Work
- On the ergodic convergence rates of a first-order primal-dual algorithm
- Solving strongly monotone variational and quasi-variational inequalities
- Lectures on convex optimization
- Dual extrapolation and its applications to solving variational inequalities and related problems
- A note on a globally convergent Newton method for solving monotone variational inequalities
- Information-based complexity of linear operator equations
- First-order algorithms for convex optimization with nonseparable objective and coupled constraints
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Lower bounds for finding stationary points I
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- Lower bounds for finding stationary points II: first-order methods
- Oracle complexity of second-order methods for smooth convex optimization
- A globally convergent Newton method for solving strongly monotone variational inequalities
- DSCOVR: Randomized Primal-Dual Block Coordinate Algorithms for Asynchronous Distributed Optimization
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Accelerated First-Order Primal-Dual Proximal Methods for Linearly Constrained Composite Convex Programming
- Convergence Rate of $\mathcal{O}(1/k)$ for Optimistic Gradient and Extragradient Methods in Smooth Convex-Concave Saddle Point Problems
- Solving variational inequalities with Stochastic Mirror-Prox algorithm
- An Accelerated Linearized Alternating Direction Method of Multipliers
- Algorithmic Game Theory