Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
DOI: 10.1137/140961134 · zbMath: 1358.90098 · arXiv: 1403.3862 · OpenAlex: W2079482358 · MaRDI QID: Q2954387
Publication date: 13 January 2017
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1403.3862
MSC classification: Convex programming (90C25); Linear programming (90C05); Parallel algorithms in computer science (68W10); Randomized algorithms (68W20)
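The record carries no abstract, so for orientation here is a minimal, hypothetical Python sketch of the kind of asynchronous stochastic coordinate descent scheme the paper analyzes: worker threads repeatedly pick a coordinate uniformly at random and apply a componentwise gradient step to a shared iterate without locking, so gradients may be computed from stale reads. The quadratic test objective, all names, and all constants below are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch (not the paper's code): asynchronous stochastic
# coordinate descent on a strongly convex quadratic
#     f(x) = 0.5 * x'Ax - b'x,
# with several threads updating a shared iterate without synchronization.
import threading

import numpy as np

rng = np.random.default_rng(0)
n = 200
M = rng.standard_normal((n, n))
A = M @ M.T / n + 0.1 * np.eye(n)   # positive definite Hessian
b = rng.standard_normal(n)
L_max = float(np.max(np.diag(A)))   # componentwise Lipschitz constant
gamma = 0.5                         # illustrative step-size parameter
x = np.zeros(n)                     # shared iterate, updated without locks

def worker(seed: int, num_updates: int) -> None:
    local_rng = np.random.default_rng(seed)
    for _ in range(num_updates):
        i = int(local_rng.integers(n))
        # This read of x may interleave with writes from other threads:
        # the "inconsistent read" the convergence analysis must tolerate.
        g_i = A[i] @ x - b[i]
        x[i] -= (gamma / L_max) * g_i  # single-coordinate write

threads = [threading.Thread(target=worker, args=(s, 20_000)) for s in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

x_star = np.linalg.solve(A, b)
f = lambda z: 0.5 * z @ A @ z - b @ z
print("suboptimality f(x) - f(x*):", f(x) - f(x_star))
```

Under CPython's global interpreter lock the threads interleave rather than run truly in parallel, but the unsynchronized reads and writes already exhibit the stale-gradient behavior whose effect on convergence and achievable speedup the paper quantifies.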
Related Items
- Parallel block coordinate minimization with application to group regularized regression
- Primal-dual algorithms for multi-agent structured optimization over message-passing architectures with bounded communication delays
- An asynchronous inertial algorithm for solving convex feasibility problems with strict pseudo-contractions in Hilbert spaces
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
- Random block coordinate descent methods for linearly constrained optimization over networks
- On unbounded delays in asynchronous parallel fixed-point algorithms
- Perturbed Iterate Analysis for Asynchronous Stochastic Optimization
- Parameter estimation in a 3-parameter p-star random graph model
- On the convergence of asynchronous parallel iteration with unbounded delays
- Zeroth-order feedback optimization for cooperative multi-agent systems
- A new large-scale learning algorithm for generalized additive models
- On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
- Convergence of an asynchronous block-coordinate forward-backward algorithm for convex composite optimization
- Synchronous parallel block coordinate descent method for nonsmooth convex function minimization
- Sample Complexity of Sample Average Approximation for Conditional Stochastic Optimization
- Asynchronous parallel algorithms for nonconvex optimization
- Stochastic block-coordinate gradient projection algorithms for submodular maximization
- Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs
- Parallel Stochastic Asynchronous Coordinate Descent: Tight Bounds on the Possible Parallelism
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
- The restricted strong convexity revisited: analysis of equivalence to error bound and quadratic growth
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions
- Markov chain block coordinate descent
- ARock: An Algorithmic Framework for Asynchronous Parallel Coordinate Updates
- Linear Convergence of Descent Methods for the Unconstrained Minimization of Restricted Strongly Convex Functions
- Coordinate descent with arbitrary sampling I: algorithms and complexity
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- Coordinatewise Descent Methods for Leading Eigenvalue Problem
- Accelerate stochastic subgradient method by leveraging local growth condition
- Accelerating Stochastic Composition Optimization
- An inertial parallel and asynchronous forward-backward iteration for distributed convex optimization
- Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
- An accelerated communication-efficient primal-dual optimization framework for structured machine learning
- Coordinate descent algorithms
- Distributed Stochastic Optimization with Large Delays
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Parallel coordinate descent methods for big data optimization
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- On the complexity analysis of randomized block-coordinate descent methods
- A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
- A coordinate gradient descent method for nonsmooth separable minimization
- Convergence of sequential and asynchronous nonlinear paracontractions
- On the convergence of the coordinate descent method for convex differentiable minimization
- Introductory lectures on convex optimization. A basic course.
- On asynchronous iterations
- Support-vector networks
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Augmented $\ell_1$ and Nuclear-Norm Models with a Globally Linearly Convergent Algorithm
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Fast Multiple-Splitting Algorithms for Convex Optimization
- Revisiting Asynchronous Linear Solvers
- Accelerated, Parallel, and Proximal Coordinate Descent
- Robust Stochastic Approximation Approach to Stochastic Programming
- Parallel Variable Distribution
- Degenerate Nonlinear Programming with a Quadratic Growth Condition
- Sparse Reconstruction by Separable Approximation
- Parallel Selective Algorithms for Nonconvex Big Data Optimization
- Parallel Gradient Distribution in Unconstrained Optimization
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- Dual Averaging for Distributed Optimization: Convergence Analysis and Network Scaling
- On the Convergence of Block Coordinate Descent Type Methods
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Convergence of a block coordinate descent method for nondifferentiable minimization