Perturbed Iterate Analysis for Asynchronous Stochastic Optimization
From MaRDI portal
Publication: 4588862
DOI: 10.1137/16M1057000 · zbMath: 1376.65096 · arXiv: 1507.06970 · OpenAlex: W2962952793 · MaRDI QID: Q4588862
Kannan Ramchandran, Benjamin Recht, Horia Mania, Xinghao Pan, Dimitris S. Papailiopoulos, Michael I. Jordan
Publication date: 3 November 2017
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1507.06970
Keywords: convergence; stochastic optimization; numerical examples; asynchronous algorithms; parallel machine learning; sparse stochastic variance-reduced gradient algorithm
MSC classification: Numerical mathematical programming methods (65K05); Learning and adaptive systems in artificial intelligence (68T05); Stochastic programming (90C15); Parallel numerical computation (65Y05)
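The keywords above refer to the paper's perturbed iterate framework, which models lock-free asynchronous SGD as an ordinary sequential method whose gradients are evaluated at stale ("perturbed") iterates rather than the current one. A minimal sketch of that view, under simplified assumptions not taken from the record (a fixed delay, a scalar quadratic objective, and hypothetical function names):

```python
def delayed_sgd(grad, x0, gamma, steps, delay):
    """Sequential simulation of asynchronous SGD in the perturbed iterate
    view: each update reads a stale iterate x_hat that is up to `delay`
    steps old, mimicking lock-free reads of shared memory."""
    history = [x0]
    x = x0
    for t in range(steps):
        x_hat = history[max(0, t - delay)]  # stale ("perturbed") read
        x = x - gamma * grad(x_hat)         # update uses the stale gradient
        history.append(x)
    return x

# Toy objective f(x) = x^2 / 2, so grad(x) = x; a small step size keeps
# the delayed recursion x_{t+1} = x_t - gamma * x_{t-delay} stable.
x_final = delayed_sgd(lambda x: x, x0=1.0, gamma=0.1, steps=200, delay=2)
```

With `gamma * delay` small enough, the delayed iteration still contracts toward the minimizer, which is the qualitative phenomenon the perturbed iterate analysis quantifies.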
Related Items
- Parallel and distributed asynchronous adaptive stochastic gradient methods
- On the convergence analysis of asynchronous SGD for solving consistent linear systems
- Nonlinear Gradient Mappings and Stochastic Optimization: A General Framework with Applications to Heavy-Tail Noise
- Asynchronous parallel algorithms for nonconvex optimization
- Parallel Stochastic Asynchronous Coordinate Descent: Tight Bounds on the Possible Parallelism
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
- Incremental without replacement sampling in nonconvex optimization
- Improved asynchronous parallel optimization analysis for stochastic incremental methods
- A robust multi-batch L-BFGS method for machine learning
- On the convergence analysis of aggregated heavy-ball method
- Distributed Stochastic Optimization with Large Delays
Uses Software
Cites Work
- Parallel coordinate descent methods for big data optimization
- Chaotic relaxation
- ARock: An Algorithmic Framework for Asynchronous Parallel Coordinate Updates
- An optimal algorithm for stochastic strongly-convex optimization
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- An Asynchronous Mini-Batch Algorithm for Regularized Stochastic Optimization
- Revisiting Asynchronous Linear Solvers
- Distributed asynchronous deterministic and stochastic gradient optimization algorithms
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm