Improved asynchronous parallel optimization analysis for stochastic incremental methods
From MaRDI portal
Publication:4614129
zbMath: 1478.68293 · arXiv: 1801.03749 · MaRDI QID: Q4614129
Fabian Pedregosa, Rémi Leblond, Simon Lacoste-Julien
Publication date: 30 January 2019
Full work available at URL: https://arxiv.org/abs/1801.03749
Convex programming (90C25) ⋮ Learning and adaptive systems in artificial intelligence (68T05) ⋮ Stochastic programming (90C15) ⋮ Parallel algorithms in computer science (68W10) ⋮ Computational aspects of data analysis and big data (68T09)
Related Items
Parallel and distributed asynchronous adaptive stochastic gradient methods ⋮ Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup ⋮ Distributed Learning with Sparse Communications by Identification ⋮ Improved asynchronous parallel optimization analysis for stochastic incremental methods ⋮ ARock: An Algorithmic Framework for Asynchronous Parallel Coordinate Updates
Uses Software
Cites Work
- Erratum to: "Minimizing finite sums with the stochastic average gradient"
- A Parallel Mixture of SVMs for Very Large Scale Problems
- Perturbed Iterate Analysis for Asynchronous Stochastic Optimization
- Distributed optimization with arbitrary local solvers
- Semi-stochastic coordinate descent
- Improved asynchronous parallel optimization analysis for stochastic incremental methods
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm