Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
From MaRDI portal
Publication: 5501228
DOI: 10.1137/140964795
zbMath: 1317.65137
OpenAlex: W1579576445
MaRDI QID: Q5501228
Publication date: 3 August 2015
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/140964795
Keywords: nonsmooth optimization; convex optimization; iteration complexity; block coordinate gradient descent method
Numerical mathematical programming methods (65K05) Convex programming (90C25) Nonlinear programming (90C30)
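To illustrate the keyword "block coordinate gradient descent method", the following is a minimal sketch of a cyclic variant applied to the smooth convex problem min_x ½‖Ax − b‖²: each sweep updates one coordinate block at a time with a gradient step of size 1/L_j, where L_j is the blockwise Lipschitz constant. This is a generic textbook-style example, not the specific algorithm or complexity analysis of the paper recorded above; the function name and block-partition scheme are illustrative choices.

```python
import numpy as np

def block_coordinate_gradient_descent(A, b, block_size=2, n_iters=500):
    """Cyclic block coordinate gradient descent for min_x 0.5*||Ax - b||^2.

    Illustrative sketch only. Each block update is a gradient step on that
    block with step size 1/L_j, where L_j = ||A_j||_2^2 is the squared
    spectral norm of the corresponding column block of A.
    """
    m, n = A.shape
    x = np.zeros(n)
    # Partition the coordinates into contiguous blocks of size `block_size`.
    blocks = [list(range(i, min(i + block_size, n)))
              for i in range(0, n, block_size)]
    # Precompute the blockwise Lipschitz constants L_j.
    L = [np.linalg.norm(A[:, blk], 2) ** 2 for blk in blocks]
    for _ in range(n_iters):
        for blk, Lj in zip(blocks, L):
            # Block of the full gradient A^T (Ax - b), restricted to blk.
            grad_blk = A[:, blk].T @ (A @ x - b)
            x[blk] -= grad_blk / Lj  # blockwise gradient step
    return x
```

On a well-conditioned least-squares instance, repeated sweeps drive the iterate toward the minimizer, which can be checked against `np.linalg.lstsq`.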
Cites Work
- On the complexity analysis of randomized block-coordinate descent methods
- Iteration complexity analysis of block coordinate descent methods
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- A coordinate gradient descent method for nonsmooth separable minimization
- Efficient block-coordinate descent algorithms for the group Lasso
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- The Group Lasso for Logistic Regression
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- On the Convergence of Block Coordinate Descent Type Methods
- Model Selection and Estimation in Regression with Grouped Variables
- Convex Analysis
- Convergence of a block coordinate descent method for nondifferentiable minimization