
Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties


DOI: 10.1137/140961134 · zbMath: 1358.90098 · arXiv: 1403.3862 · OpenAlex: W2079482358 · MaRDI QID: Q2954387

Ji Liu, Stephen J. Wright

Publication date: 13 January 2017

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1403.3862
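
The record carries no abstract, but for orientation, the following is a minimal sketch of the kind of asynchronous stochastic coordinate descent scheme the paper analyzes: several threads repeatedly pick a random coordinate and update it from a possibly stale read of a shared iterate, without locking. The quadratic objective, step size, and thread count are illustrative assumptions, not details taken from the paper.

```python
# Illustrative asynchronous stochastic coordinate descent on a
# strongly convex quadratic f(x) = 0.5 x'Qx - b'x.
# All problem data and parameters below are made up for the sketch.
import threading

import numpy as np

rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n))
Q = A.T @ A + np.eye(n)      # positive definite Hessian
b = rng.standard_normal(n)
x = np.zeros(n)              # shared iterate, updated without locks

L_max = np.max(np.diag(Q))   # bound on coordinate-wise Lipschitz constants
step = 0.5 / L_max           # conservative fixed step size

def objective(v):
    return 0.5 * v @ Q @ v - b @ v

def worker(num_updates):
    local_rng = np.random.default_rng()
    for _ in range(num_updates):
        x_read = x.copy()                # possibly stale snapshot of x
        i = local_rng.integers(n)        # coordinate sampled uniformly
        grad_i = Q[i] @ x_read - b[i]    # partial gradient at the snapshot
        x[i] -= step * grad_i            # single-coordinate write to shared x

threads = [threading.Thread(target=worker, args=(2000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

x_star = np.linalg.solve(Q, b)
print("objective gap:", objective(x) - objective(x_star))
```

The defining feature is that each update is computed from a snapshot that other threads may have partially overwritten in the meantime; the paper's analysis concerns how much such staleness can be tolerated while retaining convergence and near-linear speedup.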



Related Items

Parallel block coordinate minimization with application to group regularized regression
Primal-dual algorithms for multi-agent structured optimization over message-passing architectures with bounded communication delays
An asynchronous inertial algorithm for solving convex feasibility problems with strict pseudo-contractions in Hilbert spaces
A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
Random block coordinate descent methods for linearly constrained optimization over networks
On unbounded delays in asynchronous parallel fixed-point algorithms
Perturbed Iterate Analysis for Asynchronous Stochastic Optimization
Parameter estimation in a 3-parameter p-star random graph model
On the convergence of asynchronous parallel iteration with unbounded delays
Zeroth-order feedback optimization for cooperative multi-agent systems
A new large-scale learning algorithm for generalized additive models
On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
Convergence of an asynchronous block-coordinate forward-backward algorithm for convex composite optimization
Synchronous parallel block coordinate descent method for nonsmooth convex function minimization
Sample Complexity of Sample Average Approximation for Conditional Stochastic Optimization
Asynchronous parallel algorithms for nonconvex optimization
Stochastic block-coordinate gradient projection algorithms for submodular maximization
Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs
Parallel Stochastic Asynchronous Coordinate Descent: Tight Bounds on the Possible Parallelism
Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
The restricted strong convexity revisited: analysis of equivalence to error bound and quadratic growth
New analysis of linear convergence of gradient-type methods via unifying error bound conditions
Markov chain block coordinate descent
ARock: An Algorithmic Framework for Asynchronous Parallel Coordinate Updates
Linear Convergence of Descent Methods for the Unconstrained Minimization of Restricted Strongly Convex Functions
Coordinate descent with arbitrary sampling I: algorithms and complexity
Coordinate descent with arbitrary sampling II: expected separable overapproximation
Coordinate-Wise Descent Methods for Leading Eigenvalue Problem
Accelerate stochastic subgradient method by leveraging local growth condition
Accelerating Stochastic Composition Optimization
An inertial parallel and asynchronous forward-backward iteration for distributed convex optimization
Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
An accelerated communication-efficient primal-dual optimization framework for structured machine learning
Coordinate descent algorithms
Distributed Stochastic Optimization with Large Delays

