Dualize, split, randomize: toward fast nonsmooth optimization algorithms
Publication:2082232
DOI: 10.1007/s10957-022-02061-8
zbMath: 1502.90133
arXiv: 2004.02635
OpenAlex: W4285085550
MaRDI QID: Q2082232
Konstantin Mishchenko, Adil Salim, Peter Richtárik, Laurent Condat
Publication date: 4 October 2022
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/2004.02635
Keywords: primal-dual algorithm; minimizing the sum of three convex functions; stochastic generalizations of the algorithms
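The keywords describe primal-dual splitting for minimizing the sum of three convex functions, min_x f(x) + g(x) + h(Dx) with f smooth and g, h proximable. As a hedged illustration only, here is a sketch of the deterministic Condat–Vũ baseline (the primal-dual splitting method cited below), not the randomized algorithms of this paper; the lasso-plus-TV test problem, step sizes, and variable names are illustrative assumptions.

```python
# Sketch of the Condat-Vu primal-dual iteration (a cited baseline, not this
# paper's method) on an illustrative problem:
#   min_x 0.5*||Ax - b||^2 + lam*||x||_1 + mu*||Dx||_1
import numpy as np

rng = np.random.default_rng(0)
m, n = 30, 20
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam, mu = 0.1, 0.05                       # weights of the two nonsmooth terms
D = np.diff(np.eye(n), axis=0)            # first-difference operator (TV-like term)

grad_f = lambda x: A.T @ (A @ x - b)      # gradient of the smooth term
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)  # prox of t*||.||_1

Lf = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of grad_f
sigma = 0.5
tau = 1.0 / (Lf / 2 + sigma * np.linalg.norm(D, 2) ** 2)  # step-size condition

x, y = np.zeros(n), np.zeros(n - 1)
for _ in range(2000):
    x_new = soft(x - tau * (grad_f(x) + D.T @ y), tau * lam)  # prox of tau*g
    u = y + sigma * D @ (2 * x_new - x)
    y = np.clip(u, -mu, mu)  # prox of sigma*h*: projection, by Moreau's identity
    x = x_new

objective = lambda x: (0.5 * np.sum((A @ x - b) ** 2)
                       + lam * np.sum(np.abs(x)) + mu * np.sum(np.abs(D @ x)))
```

The dual update uses that the conjugate of mu*||.||_1 is the indicator of the ball ||y||_inf <= mu, so its prox is a clip; the step sizes satisfy the standard condition 1/tau - sigma*||D||^2 >= Lf/2.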
Related Items
- Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists
- Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
- Bregman three-operator splitting methods
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Nonlinear total variation based noise removal algorithms
- Principal component-guided sparse regression
- On the ergodic convergence rates of a first-order primal-dual algorithm
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- Primal-dual splitting algorithm for solving inclusions with mixtures of composite, Lipschitzian, and parallel-sum type monotone operators
- A three-operator splitting scheme and its optimization applications
- Lectures on convex optimization
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- A new primal-dual algorithm for minimizing the sum of three functions with a linear operator
- A simple algorithm for a class of nonsmooth convex-concave saddle-point problems
- Proximal algorithms in statistics and machine learning
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A splitting algorithm for dual monotone inclusions involving cocoercive operators
- Single-forward-step projective splitting: exploiting cocoercivity
- Projective splitting with forward steps
- Uniqueness of DRS as the 2 operator resolvent-splitting and impossibility of 3 operator resolvent-splitting
- On the equivalence of the primal-dual hybrid gradient method and Douglas-Rachford splitting
- First-order and stochastic optimization methods for machine learning
- Coordinate descent algorithms
- Asynchronous block-iterative primal-dual decomposition methods for monotone inclusions
- A family of projective splitting methods for the sum of two maximal monotone operators
- Collaborative Total Variation: A General Framework for Vectorial TV Models
- DSA: Decentralized Double Stochastic Averaging Gradient Algorithm
- Dual Constrained TV-based Regularization on Graphs
- Optimization with Sparsity-Inducing Penalties
- On Weak Convergence of the Douglas–Rachford Method
- On a generalization of the iterative soft-thresholding algorithm for the case of non-separable penalty
- Discrete Total Variation: New Definition and Minimization
- Revisiting EXTRA for Smooth Distributed Optimization
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- First-Order Methods in Optimization
- A primal–dual fixed point algorithm for convex separable minimization with applications to image restoration
- Recent Developments on Primal–Dual Splitting Methods with Applications to Convex Minimization
- Decentralized Proximal Gradient Algorithms With Linear Convergence Rates
- Distributed Algorithms for Composite Optimization: Unified Framework and Convergence Analysis
- Fixed Point Strategies in Data Science
- Proximal Activation of Smooth Functions in Splitting Algorithms for Convex Image Recovery
- Snake: A Stochastic Proximal Gradient Algorithm for Regularized Problems Over Large Graphs
- Convergence Rates for Projective Splitting
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Solving Coupled Composite Monotone Inclusions by Successive Fejér Approximations of their Kuhn–Tucker Set
- EXTRA: An Exact First-Order Algorithm for Decentralized Consensus Optimization
- An introduction to continuous optimization for imaging
- Total Generalized Variation
- Convex analysis and monotone operator theory in Hilbert spaces
- Sparse Image and Signal Processing