Approximate first-order primal-dual algorithms for saddle point problems
Publication:5856742
DOI: 10.1090/mcom/3610 · zbMath: 1461.65159 · OpenAlex: W3096661406 · MaRDI QID: Q5856742
Zhongming Wu, Deren Han, Fan Jiang, Xing-Ju Cai
Publication date: 29 March 2021
Published in: Mathematics of Computation
Full work available at URL: https://doi.org/10.1090/mcom/3610
Keywords: global convergence, convex optimization, saddle point problems, first-order primal-dual algorithm, inexact criteria
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Numerical optimization and variational techniques (65K10)
Related Items
- On the linear convergence of the general first order primal-dual algorithm
- Unified linear convergence of first-order primal-dual algorithms for saddle point problems
- A partially inexact generalized primal-dual hybrid gradient method for saddle point problems with bilinear couplings
- Understanding the convergence of the preconditioned PDHG method: a view of indefinite proximal ADMM
- A second order primal-dual dynamical system for a convex-concave bilinear saddle point problem
- Inexact generalized ADMM with relative error criteria for linearly constrained convex optimization problems
- An alternative extrapolation scheme of PDHGM for saddle point problem with nonlinear function
- A first-order inexact primal-dual algorithm for a class of convex-concave saddle point problems
- A primal-dual flow for affine constrained convex optimization
Cites Work
- Nonlinear total variation based noise removal algorithms
- On the ergodic convergence rates of a first-order primal-dual algorithm
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- A practical relative error criterion for augmented Lagrangians
- An improved first-order primal-dual algorithm with a new correction step
- An inexact alternating direction method of multipliers with relative error criteria
- A unified primal-dual algorithm framework based on Bregman iteration
- A three-operator splitting scheme and its optimization applications
- A primal-dual prediction-correction algorithm for saddle point optimization
- A reduced Newton method for constrained linear least-squares problems
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- An algorithm for total variation minimization and applications
- Error bounds for proximal point subproblems and associated inexact proximal point algorithms
- Relative-error approximate versions of Douglas-Rachford splitting and special cases of the ADMM
- Approximate ADMM algorithms derived from Lagrangian splitting
- An algorithmic framework of generalized primal-dual hybrid gradient methods for saddle point problems
- On inexact ADMMs with relative error criteria
- The cosparse analysis model and algorithms
- Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
- A new primal-dual algorithm for minimizing the sum of three functions with a linear operator
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- A splitting algorithm for dual monotone inclusions involving cocoercive operators
- Acceleration of primal-dual methods by preconditioning and simple subproblem procedures
- Inexact first-order primal-dual algorithms
- A customized Douglas-Rachford splitting algorithm for separable convex minimization with linear constraints
- Convergence Analysis of Primal-Dual Algorithms for a Saddle-Point Problem: From Contraction Perspective
- A General Framework for a Class of First Order Primal-Dual Algorithms for Convex Optimization in Imaging Science
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Smoothing and Decomposition for Analysis Sparse Recovery
- A First-Order Primal-Dual Algorithm with Linesearch
- On the Convergence of Primal-Dual Hybrid Gradient Algorithm