Distributed Heavy-Ball: A Generalization and Acceleration of First-Order Methods With Gradient Tracking
Publication: 5125684
DOI: 10.1109/TAC.2019.2942513
MaRDI QID: Q5125684
Publication date: 7 October 2020
Published in: IEEE Transactions on Automatic Control
Full work available at URL: https://arxiv.org/abs/1808.02942
Related Items (15)
Distributed adaptive Newton methods with global superlinear convergence
Blended dynamics approach to distributed optimization: sum convexity and convergence rate
Distributed Optimization Based on Gradient Tracking Revisited: Enhancing Convergence Rate via Surrogation
An event-triggering algorithm for decentralized stochastic optimization over networks
An accelerated exact distributed first-order algorithm for optimization over directed networks
A distributed accelerated optimization algorithm over time-varying directed graphs with uncoordinated step-sizes
Linear convergence rate analysis of a class of exact first-order distributed methods for weight-balanced time-varying networks and uncoordinated step sizes
Heavy-ball-based optimal thresholding algorithms for sparse linear inverse problems
Distributed online convex optimization with multiple coupled constraints: a double accelerated push-pull algorithm
Heavy-ball-based hard thresholding algorithms for sparse signal recovery
ET-PDA: an event-triggered parameter distributed accelerated algorithm for economic dispatch problems
EFIX: exact fixed point methods for distributed optimization
Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization
Unnamed Item
An accelerated distributed gradient method with local memory