
Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds

From MaRDI portal
Publication: 3465244

DOI: 10.1137/130950288
zbMath: 1329.90108
arXiv: 1312.5302
OpenAlex: W2592062427
MaRDI QID: Q3465244

Dragos Clipici, Ion Necoara

Publication date: 21 January 2016

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1312.5302




Related Items (27)

Parallel block coordinate minimization with application to group regularized regression
Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
Accelerated, Parallel, and Proximal Coordinate Descent
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
Random block coordinate descent methods for linearly constrained optimization over networks
Extended randomized Kaczmarz method for sparse least squares and impulsive noise problems
Linear convergence of first order methods for non-strongly convex optimization
Randomized Block Adaptive Linear System Solvers
Random Coordinate Descent Methods for Nonseparable Composite Optimization
Synchronous parallel block coordinate descent method for nonsmooth convex function minimization
Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
A Randomized Coordinate Descent Method with Volume Sampling
On the linear convergence of the approximate proximal splitting method for non-smooth convex optimization
On the complexity of parallel coordinate descent
Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
RSG: Beating Subgradient Method without Smoothness and Strong Convexity
Faster Randomized Block Kaczmarz Algorithms
Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
Coordinate descent algorithms
Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems
Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity
Proximal Gradient Methods for Machine Learning and Imaging
Parallel coordinate descent methods for big data optimization

