On the Convergence of Stochastic Primal-Dual Hybrid Gradient
From MaRDI portal
Publication:5081780
DOI: 10.1137/19M1296252 · zbMath: 1494.90075 · arXiv: 1911.00799 · OpenAlex: W2987616811 · MaRDI QID: Q5081780
Volkan Cevher, Ahmet Alacaoglu, Olivier Fercoq
Publication date: 17 June 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1911.00799
Mathematics Subject Classification:
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Minimax problems in mathematical programming (90C47)
- Numerical optimization and variational techniques (65K10)
- Discrete approximations in optimal control (49M25)
Related Items (5)
- A new randomized primal-dual algorithm for convex optimization with fast last iterate convergence rates
- Simple and Optimal Methods for Stochastic Variational Inequalities, I: Operator Extrapolation
- Cyclic Coordinate Dual Averaging with Extrapolation
- Faster first-order primal-dual methods for linear programming using restarts and sharpness
- Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
Uses Software
Cites Work
- On the ergodic convergence rates of a first-order primal-dual algorithm
- Convergence rates with inexact non-expansive operators
- Incremental proximal methods for large scale convex optimization
- Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping. II: Mean-square and linear convergence
- An optimal randomized incremental gradient method
- A first-order primal-dual algorithm for convex problems with applications to imaging
- An adaptive primal-dual framework for nonsmooth convex minimization
- Primal-dual proximal algorithms for structured convex optimization: a unifying framework
- Block-coordinate primal-dual method for nonsmooth minimization over linear constraints
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Atomic Decomposition by Basis Pursuit
- A Class of Randomized Primal-Dual Algorithms for Distributed Optimization
- Linear Convergence of the Alternating Direction Method of Multipliers for a Class of Convex Optimization Problems
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Convergence Analysis of Primal-Dual Algorithms for a Saddle-Point Problem: From Contraction Perspective
- A General Framework for a Class of First Order Primal-Dual Algorithms for Convex Optimization in Imaging Science
- Partial Error Bound Conditions and the Linear Convergence Rate of the Alternating Direction Method of Multipliers
- Accelerated, Parallel, and Proximal Coordinate Descent
- Implicit Functions and Solution Mappings
- Robust Stochastic Approximation Approach to Stochastic Programming
- Nonasymptotic convergence of stochastic proximal point algorithms for constrained convex optimization
- PhaseMax: Convex Phase Retrieval via Basis Pursuit
- A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
- A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
- Random Gradient Extrapolation for Distributed and Stochastic Optimization
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Primal-Dual Stochastic Gradient Method for Convex Programs with Many Functional Constraints
- A New Randomized Block-Coordinate Primal-Dual Proximal Algorithm for Distributed Optimization
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Randomized Projection Methods for Convex Feasibility: Conditioning and Convergence Rates
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping
- A Stochastic Approximation Method
- Convex analysis and monotone operator theory in Hilbert spaces
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization