Asynchronous fully-decentralized SGD in the cluster-based model
From MaRDI portal
DOI: 10.1007/978-3-031-30448-4_5
arXiv: 2202.10862
MaRDI QID: Q6057314
Publication date: 4 October 2023
Published in: Lecture Notes in Computer Science
Full work available at URL: https://arxiv.org/abs/2202.10862
Keywords: stochastic gradient descent, distributed learning, asynchronous computing, cluster-based model, multi-dimensional approximate agreement
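The keywords above include stochastic gradient descent. As background only (not the paper's cluster-based asynchronous protocol), a minimal sketch of the classical SGD update x ← x − η ĝ(x), where ĝ is a noisy gradient estimate, here applied to a simple quadratic objective chosen purely for illustration:

```python
import random

def sgd(grad_sample, x0, lr=0.1, steps=200):
    """Minimal stochastic gradient descent loop.

    grad_sample(x) must return a noisy, unbiased estimate of the
    gradient of the objective at x. Returns the final iterate.
    """
    x = x0
    for _ in range(steps):
        x = x - lr * grad_sample(x)  # standard SGD step: x <- x - lr * g_hat
    return x

# Illustrative objective f(x) = (x - 3)^2, true gradient 2*(x - 3),
# perturbed with Gaussian noise to mimic a stochastic oracle.
random.seed(0)
x_final = sgd(lambda x: 2 * (x - 3) + random.gauss(0, 0.1), x0=0.0)
```

With this step size the iterates contract toward the minimizer x* = 3 geometrically, so `x_final` lands close to 3 despite the gradient noise.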
Cites Work
- Multidimensional agreement in Byzantine systems
- Reaching approximate agreement in the presence of faults
- A per letter converse to the channel coding theorem
- Optimization Methods for Large-Scale Machine Learning
- Fast Multidimensional Asymptotic and Approximate Consensus
- The Convergence of Stochastic Gradient Descent in Asynchronous Shared Memory
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- A Stochastic Approximation Method
- Genuinely Distributed Byzantine Machine Learning