Adaptive Douglas–Rachford Splitting Algorithm from a Yosida Approximation Standpoint
From MaRDI portal
Publication: 5010046
DOI: 10.1137/20M131388X
OpenAlex: W3191012382
MaRDI QID: Q5010046
Publication date: 24 August 2021
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/20m131388x
Keywords: resolvent; linear convergence; forward-backward algorithm; Yosida approximation; weak monotonicity; inclusion problem; comonotone; adaptive Douglas–Rachford algorithm
MSC classifications: Numerical mathematical programming methods (65K05); Numerical optimization and variational techniques (65K10); Fixed-point theorems (47H10); Rate of convergence, degree of approximation (41A25); Decomposition methods (49M27)
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A Newton-type curvilinear search method for optimization
- Tight global linear convergence rate bounds for Douglas–Rachford splitting
- Demiclosedness principles for generalized nonexpansive mappings
- Multiplier and gradient methods
- On Weak Convergence of the Douglas–Rachford Method
- The Numerical Solution of Parabolic and Elliptic Differential Equations
- On the Numerical Solution of Heat Conduction Problems in Two and Three Space Variables
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Monotone Operators and the Proximal Point Algorithm
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Proximal Methods for Cohypomonotone Operators
- Adaptive Douglas–Rachford Splitting Algorithm for the Sum of Two Operators
- Convergence Analysis of Douglas–Rachford Splitting Method for "Strongly + Weakly" Convex Programming
- Convex programming in Hilbert space
- Functional Operators (AM-22), Volume 2
- Convex analysis and monotone operator theory in Hilbert spaces