Decentralized online strongly convex optimization with general compressors and random disturbances
From MaRDI portal
Publication: 6661694
DOI: 10.1007/s10957-024-02595-z
MaRDI QID: Q6661694
Deming Yuan, Honglei Liu, Baoyong Zhang
Publication date: 13 January 2025
Published in: Journal of Optimization Theory and Applications
Cites Work
- Primal recovery from consensus-based dual decomposition for distributed convex optimization
- Logarithmic regret algorithms for online convex optimization
- Optimal distributed stochastic mirror descent for strongly convex optimization
- A multi-scale method for distributed convex optimization with constraints
- Improving the convergence of distributed gradient descent via inexact average consensus
- An improved distributed gradient-push algorithm for bandwidth resource allocation over wireless local area network
- Privacy-preserving dual stochastic push-sum algorithm for distributed constrained optimization
- On Convergence Rate of Distributed Stochastic Gradient Algorithm for Convex Optimization with Inequality Constraints
- Online Distributed Convex Optimization on Dynamic Networks
- Fast Distributed Gradient Methods
- Distributed Wireless Sensor Network Localization Via Sequential Greedy Optimization Algorithm
- Distributed Sparse Linear Regression
- Distributed Subgradient Methods for Multi-Agent Optimization
- Convergence Rates of Distributed Gradient Methods Under Random Quantization: A Stochastic Approximation Approach
- Communication Compression for Distributed Nonconvex Optimization
- Quantized Distributed Gradient Tracking Algorithm With Linear Convergence in Directed Networks
- A Compressed Gradient Tracking Method for Decentralized Optimization With Linear Convergence
- SPARQ-SGD: Event-Triggered and Compressed Communication in Decentralized Optimization
- Gradient-Tracking-Based Distributed Optimization With Guaranteed Optimality Under Noisy Information Sharing
- Finite-Bit Quantization for Distributed Algorithms With Linear Convergence
- Innovation Compression for Communication-Efficient Distributed Optimization With Linear Convergence