Additive Schwarz methods for convex optimization -- convergence theory and acceleration
From MaRDI portal
Publication: 6630175
DOI: 10.1007/978-3-030-95025-5_78
MaRDI QID: Q6630175
Publication date: 30 October 2024
MSC classifications:
- Multigrid methods; domain decomposition for boundary value problems involving PDEs (65N55)
- Multigrid methods; domain decomposition for initial value and initial-boundary value problems involving PDEs (65M55)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- A simplified view of first order methods for optimization
- Rate of convergence for some constraint decomposition methods for nonlinear variational inequalities
- Pseudo-linear convergence of an additive Schwarz method for dual total variation minimization
- Adaptive restart for accelerated gradient schemes
- Global and uniform convergence of subspace correction methods for some convex optimization problems
- Additive Schwarz Methods for Convex Optimization as Gradient Methods
- Sharpness, Restart, and Acceleration
- Backtracking Strategies for Accelerated Descent Methods with Smooth Composite Objectives
- Convergence Rate of Overlapping Domain Decomposition Methods for the Rudin--Osher--Fatemi Model Based on a Dual Formulation
- An introduction to continuous optimization for imaging