Additive Schwarz Methods for Convex Optimization as Gradient Methods
Publication: 5110553
DOI: 10.1137/19M1300583 · zbMath: 1434.65302 · arXiv: 1912.03617 · MaRDI QID: Q5110553
Publication date: 20 May 2020
Published in: SIAM Journal on Numerical Analysis
Full work available at URL: https://arxiv.org/abs/1912.03617
Keywords: convergence analysis · convex optimization · gradient method · domain decomposition method · additive Schwarz method
MSC classification: Multigrid methods; domain decomposition for boundary value problems involving PDEs (65N55) · Convex programming (90C25) · Numerical solutions to equations with nonlinear operators (65J15) · Numerical methods for variational inequalities and related problems (65K15)
Related Items (4)
Additive Schwarz methods for convex optimization with backtracking ⋮ Accelerated additive Schwarz methods for convex optimization with adaptive restart ⋮ Fast gradient methods for uniformly convex and weakly smooth problems ⋮ Preconditioning for finite element methods with strain smoothing
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- Rate of convergence for some constraint decomposition methods for nonlinear variational inequalities
- On the abstract theory of additive and multiplicative Schwarz algorithms
- A finite element nonoverlapping domain decomposition method with Lagrange multipliers for the dual total variation minimizations
- Preconditioned descent algorithms for $p$-Laplacian
- A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications
- Discrete total variation with finite elements and applications to imaging
- One- and two-level Schwarz methods for variational inequalities of the second kind and their application to frictional contact
- An Algebraic Convergence Theory for Restricted Additive Schwarz Methods Using Weighted Max Norms
- Global and uniform convergence of subspace correction methods for some convex optimization problems
- A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
- On the Convergence of Alternating Minimization for Convex Programming with Applications to Iteratively Reweighted Least Squares and Decomposition Schemes
- Accelerated, Parallel, and Proximal Coordinate Descent
- Iterative Methods by Space Decomposition and Subspace Correction
- Rate of Convergence of Some Space Decomposition Methods for Linear and Nonlinear Problems
- The method of alternating projections and the method of subspace corrections in Hilbert space
- Convergence Rate Analysis of a Multiplicative Schwarz Method for Variational Inequalities
- An additive Schwarz method for variational inequalities
- Total Bounded Variation Regularization as a Bilaterally Constrained Optimization Problem
- An Overlapping Schwarz Algorithm for Raviart–Thomas Vector Fields with Discontinuous Coefficients
- Fast Nonoverlapping Block Jacobi Method for the Dual Rudin–Osher–Fatemi Model
- Convergence Rate of Overlapping Domain Decomposition Methods for the Rudin–Osher–Fatemi Model Based on a Dual Formulation
- An $L^1$ Penalty Method for General Obstacle Problems
- On the Convergence of Block Coordinate Descent Type Methods
- An Algorithm for Splitting Parallel Sums of Linearly Composed Monotone Operators, with Applications to Signal Recovery
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Convergence Rate of a Schwarz Multilevel Method for the Constrained Minimization of Nonquadratic Functionals
- Primal Domain Decomposition Methods for the Total Variation Minimization, Based on Dual Decomposition
- An introduction to continuous optimization for imaging
- Convex analysis and monotone operator theory in Hilbert spaces
- Convergence of a block coordinate descent method for nondifferentiable minimization