ADD-OPT: Accelerated Distributed Directed Optimization
From MaRDI portal
Publication: 5375217
zbMath: 1395.90204
arXiv: 1607.04757
MaRDI QID: Q5375217
Ran Xin, Chenguang Xi, Usman A. Khan
Publication date: 14 September 2018
Full work available at URL: https://arxiv.org/abs/1607.04757
Related Items
Robust Asynchronous Stochastic Gradient-Push: Asymptotically Optimal and Network-Independent Performance for Strongly Convex Functions
Distributed Optimization Based on Gradient Tracking Revisited: Enhancing Convergence Rate via Surrogation
A distributed methodology for approximate uniform global minimum sharing
Tracking-ADMM for distributed constraint-coupled optimization
An event-triggering algorithm for decentralized stochastic optimization over networks
An accelerated exact distributed first-order algorithm for optimization over directed networks
A stochastic averaging gradient algorithm with multi‐step communication for distributed optimization
A distributed accelerated optimization algorithm over time‐varying directed graphs with uncoordinated step‐sizes
Resilient consensus‐based distributed optimization under deception attacks
Distributed nonconvex constrained optimization over time-varying digraphs
Distributed online convex optimization with multiple coupled constraints: a double accelerated push-pull algorithm
Linear convergence of distributed estimation with constraints and communication delays
Dynamics based privacy preservation in decentralized optimization
A distributed algorithm for solving mixed equilibrium problems
Distributed decision-coupled constrained optimization via proximal-tracking
ADD-OPT
Surplus-based accelerated algorithms for distributed optimization over directed networks
Triggered gradient tracking for asynchronous distributed optimization
On the convergence of exact distributed generalisation and acceleration algorithm for convex optimisation