Decentralized Stochastic Optimization and Gossip Algorithms with Compressed Communication

From MaRDI portal
arXiv: 1902.00340
MaRDI QID: Q6313427

Author name not available

Publication date: 1 February 2019

Abstract: We consider decentralized stochastic optimization with the objective function (e.g. data samples for a machine learning task) distributed over $n$ machines that can only communicate with their neighbors on a fixed communication graph. To reduce the communication bottleneck, the nodes compress (e.g. quantize or sparsify) their model updates. We cover both unbiased and biased compression operators with quality denoted by $\omega \leq 1$ ($\omega = 1$ meaning no compression). We (i) propose a novel gossip-based stochastic gradient descent algorithm, CHOCO-SGD, that converges at rate $\mathcal{O}\left(1/(nT) + 1/(T\delta^2\omega)^2\right)$ for strongly convex objectives, where $T$ denotes the number of iterations and $\delta$ the eigengap of the connectivity matrix. Although compression quality and network connectivity affect the higher-order terms, the first term in the rate, $\mathcal{O}(1/(nT))$, is the same as for the centralized baseline with exact communication. We (ii) present a novel gossip algorithm, CHOCO-GOSSIP, for the average consensus problem that converges in time $\mathcal{O}(1/(\delta^2\omega)\log(1/\epsilon))$ for accuracy $\epsilon > 0$. This is (to the best of our knowledge) the first gossip algorithm that supports arbitrary compressed messages for $\omega > 0$ and still exhibits linear convergence. We (iii) show in experiments that both of our algorithms outperform the respective state-of-the-art baselines and that CHOCO-SGD can reduce communication by at least two orders of magnitude.
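As a concrete illustration of the compressed-gossip idea described in the abstract, below is a minimal numpy sketch of a consensus loop in the spirit of CHOCO-GOSSIP. The top-k sparsifier, the step size gamma, and all function and variable names are illustrative assumptions made here; this is a sketch under those assumptions, not the authors' algorithm verbatim (see the companion repository linked below for actual code).

    import numpy as np

    def top_k(v, k):
        """Keep the k largest-magnitude entries of v, zero the rest (a biased compressor)."""
        out = np.zeros_like(v)
        idx = np.argpartition(np.abs(v), -k)[-k:]
        out[idx] = v[idx]
        return out

    def compressed_gossip(x0, W, k, gamma=0.05, steps=1000):
        """Gossip averaging where nodes exchange only compressed residuals.

        x0    : (n, d) array, one row of data per node.
        W     : (n, n) symmetric, doubly stochastic mixing matrix of the graph.
        k     : sparsity level of the illustrative top-k compressor.
        gamma : consensus step size; assumed small enough for the given graph.
        """
        x = x0.astype(float).copy()
        x_hat = np.zeros_like(x)            # public estimates, also tracked by neighbors
        L = np.eye(W.shape[0]) - W          # I - W; doubly stochastic W keeps the average of x fixed
        for _ in range(steps):
            x = x - gamma * (L @ x_hat)     # gossip step on the publicly known estimates
            # each node compresses its residual; neighbors apply the same update to x_hat
            q = np.vstack([top_k(x[i] - x_hat[i], k) for i in range(x.shape[0])])
            x_hat = x_hat + q
        return x                            # rows drift toward the average of x0's rows

The design point this sketch tries to capture: because each node and its neighbors maintain the same public estimate x_hat, only the compressed residual x - x_hat ever crosses the network, and since W is doubly stochastic the exact average of the local values is preserved at every step, which is what allows heavily compressed messages without biasing the consensus value.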




Has companion code repository: https://github.com/Adirlou/OptML_Project









