Two Symmetrized Coordinate Descent Methods Can Be $O(n^2)$ Times Slower Than the Randomized Version
DOI: 10.1137/19M1292473 · zbMath: 1480.90196 · OpenAlex: W3211024804 · MaRDI QID: Q5162659
Peijun Xiao, Zhisheng Xiao, Ruoyu Sun
Publication date: 5 November 2021
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/19m1292473
Keywords: convex optimization; alternating direction method of multipliers; coordinate descent; worst-case efficiency estimates
MSC classification: Analysis of algorithms and problem complexity (68Q25); Numerical mathematical programming methods (65K05); Convex programming (90C25)
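The paper compares coordinate descent under fixed, symmetrized update orders (such as a forward-then-backward symmetric Gauss-Seidel sweep) against i.i.d. uniform random coordinate sampling, and shows the symmetrized orders can be $O(n^2)$ times slower in the worst case. The sketch below is an illustration only, not the paper's worst-case construction: the random quadratic test problem, the helper names `cd_step` and `run`, and the epoch budget are all assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def cd_step(A, b, x, i):
    # Exact minimization of f(x) = 0.5 x^T A x - b^T x along coordinate i:
    # set the i-th partial derivative A[i] @ x - b[i] to zero.
    x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]

def run(A, b, order_fn, n_epochs):
    x = np.zeros(len(b))
    for _ in range(n_epochs):
        for i in order_fn(len(b)):
            cd_step(A, b, x, i)
    return x

# Update orders: cyclic sweep, symmetrized (forward-then-backward) sweep,
# and i.i.d. uniform sampling. Note the symmetrized epoch performs 2n
# updates, so epochs are not cost-matched across the three orders.
cyclic     = lambda n: range(n)
symmetric  = lambda n: list(range(n)) + list(range(n - 1, -1, -1))
randomized = lambda n: rng.integers(0, n, size=n)

n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + 0.1 * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)

for name, order in [("cyclic", cyclic), ("symmetrized", symmetric),
                    ("randomized", randomized)]:
    x = run(A, b, order, n_epochs=200)
    print(f"{name:11s}  ||x - x*|| = {np.linalg.norm(x - x_star):.3e}")
```

On a generic well-conditioned random quadratic like this one, all three orders typically behave comparably; the $O(n^2)$ separation established in the paper arises on specially constructed worst-case instances.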
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Tensor Decompositions and Applications
- Iteration complexity analysis of multi-block ADMM for a family of convex minimization without strong convexity
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- On the complexity analysis of randomized block-coordinate descent methods
- An efficient inexact symmetric Gauss-Seidel based majorized ADMM for high-dimensional convex composite conic programming
- On the linear convergence of the alternating direction method of multipliers
- On the sublinear convergence rate of multi-block ADMM
- Triangular truncation and finding the norm of a Hadamard multiplier
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- Parallel multi-block ADMM with $o(1/k)$ convergence
- A note on the alternating direction method of multipliers
- On the convergence analysis of the alternating direction method of multipliers with three blocks
- Randomness and permutations in coordinate descent methods
- Worst-case complexity of cyclic coordinate descent: $O(n^2)$ gap with randomized version
- Why random reshuffling beats stochastic gradient descent
- Coordinate descent algorithms
- On the proximal Jacobian decomposition of ALM for multiple-block separable convex minimization problems and its relationship to ADMM
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Alternating Direction Method with Gaussian Back Substitution for Separable Convex Programming
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- A Convergent $3$-Block Semi-Proximal ADMM for Convex Minimization Problems with One Strongly Convex Block
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- An augmented Lagrangian based parallel splitting method for separable convex minimization with applications to image processing
- A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
- Accelerated, Parallel, and Proximal Coordinate Descent
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- On Full Jacobian Decomposition of the Augmented Lagrangian Method for Separable Convex Programming
- On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
- An Iteratively Weighted MMSE Approach to Distributed Sum-Utility Maximization for a MIMO Interfering Broadcast Channel
- On the Efficiency of Random Permutation for ADMM and Coordinate Descent
- Analyzing random permutations for cyclic coordinate descent
- A Convergent 3-Block SemiProximal Alternating Direction Method of Multipliers for Conic Programming with 4-Type Constraints
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- On the Global Linear Convergence of the ADMM with MultiBlock Variables
- Adjustment of an Inverse Matrix Corresponding to a Change in One Element of a Given Matrix
- Random permutations fix a worst case for cyclic coordinate descent
- Convergence of a block coordinate descent method for nondifferentiable minimization
- The direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent
- A Schur complement based semi-proximal ADMM for convex quadratic conic programming and extensions