On the rate of convergence of alternating minimization for non-smooth non-strongly convex optimization in Banach spaces
From MaRDI portal
Publication: Q2115326
DOI: 10.1007/s11590-021-01753-w
zbMath: 1487.90522
arXiv: 1911.00404
OpenAlex: W3165796876
MaRDI QID: Q2115326
Publication date: 15 March 2022
Published in: Optimization Letters
Full work available at URL: https://arxiv.org/abs/1911.00404
Keywords: rate of convergence; Banach spaces; convex optimization; linear convergence; sublinear convergence; alternating minimization
Uses Software
Cites Work
- Error bounds and convergence analysis of feasible descent methods: A general approach
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Coordinate descent algorithms
- Restricted strong convexity and its applications to convergence analysis of gradient-type methods in convex optimization
- Linear convergence of first order methods for non-strongly convex optimization
- On the Convergence of Alternating Minimization for Convex Programming with Applications to Iteratively Reweighted Least Squares and Decomposition Schemes
- Rate of Convergence of Some Space Decomposition Methods for Linear and Nonlinear Problems
- Globally convergent block-coordinate techniques for unconstrained optimization
- DUNE — The Distributed and Unified Numerics Environment
- On the Convergence of Block Coordinate Descent Type Methods