Accelerated minimax algorithms flock together
From MaRDI portal
Publication: Q6663115
DOI: 10.1137/22M1504597
MaRDI QID: Q6663115
Publication date: 14 January 2025
Published in: SIAM Journal on Optimization
Keywords: minimax optimization; rates of convergence; acceleration; monotone operators; first-order methods; gradient norm reduction
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Optimized first-order methods for smooth convex minimization
- On the ergodic convergence rates of a first-order primal-dual algorithm
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- Low-cost modification of Korpelevich's methods for monotone equilibrium problems
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- The subgradient extragradient method for solving variational inequalities in Hilbert space
- Solving strongly monotone variational and quasi-variational inequalities
- On the convergence rate of the Halpern-iteration
- Dual extrapolation and its applications to solving variational inequalities and related problems
- Monotone (nonlinear) operators in Hilbert space
- A modification of the Arrow-Hurwicz method for search of saddle points
- A hybrid approximate extragradient-proximal point algorithm using the enlargement of a maximal monotone operator
- Accelerated schemes for a class of variational inequalities
- A new primal-dual algorithm for minimizing the sum of three functions with a linear operator
- New extragradient-type methods for general variational inequalities
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A splitting algorithm for dual monotone inclusions involving cocoercive operators
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- On lower iteration complexity bounds for the convex concave saddle point problems
- Golden ratio algorithms for variational inequalities
- Accelerated methods for saddle-point problem
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- Accelerated proximal point method for maximally monotone operators
- An extragradient algorithm for monotone variational inequalities
- Shadow Douglas-Rachford splitting for monotone inclusions
- Near-optimal no-regret algorithms for zero-sum games
- Proximal Splitting Methods in Signal Processing
- Convergence Analysis of Primal-Dual Algorithms for a Saddle-Point Problem: From Contraction Perspective
- On the Complexity of the Hybrid Proximal Extragradient Method for the Iterates and the Ergodic Mean
- Strong convergence of subgradient extragradient methods for the variational inequality problem in Hilbert space
- Complexity of Variants of Tseng's Modified F-B Splitting and Korpelevich's Methods for Hemivariational Inequalities with Applications to Saddle-point and Convex Optimization Problems
- On the Numerical Solution of Heat Conduction Problems in Two and Three Space Variables
- An Accelerated HPE-Type Algorithm for a Class of Composite Convex-Concave Saddle-Point Problems
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- New Proximal Point Algorithms for Convex Minimization
- Monotone Operators and the Proximal Point Algorithm
- An accelerated non-Euclidean hybrid proximal extragradient-type algorithm for convex–concave saddle-point problems
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- A Modified Forward-Backward Splitting Method for Maximal Monotone Mappings
- A Primal-Dual Algorithm with Line Search for General Convex-Concave Saddle Point Problems
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- A Forward-Backward Splitting Method for Monotone Inclusions Without Cocoercivity
- Convergence Rate of $\mathcal{O}(1/k)$ for Optimistic Gradient and Extragradient Methods in Smooth Convex-Concave Saddle Point Problems
- Efficient Search of First-Order Nash Equilibria in Nonconvex-Concave Smooth Min-Max Problems
- An Accelerated Inexact Proximal Point Method for Solving Nonconvex-Concave Min-Max Problems
- Solving variational inequalities with Stochastic Mirror-Prox algorithm
- Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems
- Optimal Primal-Dual Methods for a Class of Saddle Point Problems
- Projected Reflected Gradient Methods for Monotone Variational Inequalities
- Proximité et dualité dans un espace hilbertien
- Fixed points of nonexpanding maps
- Convex Analysis
- Training GANs with centripetal acceleration
- Convex analysis and monotone operator theory in Hilbert spaces
- An optimal gradient method for smooth strongly convex minimization
- Factor-$\sqrt{2}$ acceleration of accelerated gradient methods
- Minimax Problems with Coupled Linear Constraints: Computational Complexity and Duality
- Understanding Nesterov's Acceleration via Proximal Point Method
Related Items (1)
- Accelerated minimax algorithms flock together