Projected subgradient based distributed convex optimization with transmission noises
From MaRDI portal
Publication:2073071
DOI: 10.1016/j.amc.2021.126794
OpenAlex: W3216242265
MaRDI QID: Q2073071
Publication date: 27 January 2022
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2021.126794
additive noise; distributed convex optimization; projected subgradient algorithm; polyhedric set constraint; random inner space
Mathematical programming (90Cxx); Stochastic systems and control (93Exx); General systems theory (93Axx)
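The record carries no abstract, but the title and keywords describe distributed convex optimization via a projected subgradient algorithm under additive transmission noise. As a hedged illustration only (not the paper's algorithm: the problem data `c`, mixing matrix `W`, noise model, and step sizes are all assumptions), a minimal sketch of such a scheme is:

```python
import numpy as np

# Hedged sketch: n agents cooperatively minimize f(x) = sum_i f_i(x) with
# f_i(x) = |x - c_i| over the constraint set X = [-1, 1], exchanging states
# over links that add noise. All concrete choices below are illustrative
# assumptions, not taken from the paper.
rng = np.random.default_rng(0)

n = 4
c = np.array([-0.5, 0.0, 0.5, 1.5])     # private targets of f_i (assumed data)
W = np.full((n, n), 1.0 / n)            # doubly stochastic mixing weights

def project(x):
    """Projection onto the constraint set X = [-1, 1]."""
    return np.clip(x, -1.0, 1.0)

x = rng.uniform(-1.0, 1.0, n)           # agents' initial states
sigma = 0.1                             # base transmission-noise level
for k in range(1, 2001):
    alpha = 1.0 / np.sqrt(k)            # diminishing step size
    # received[i, j] = agent j's state as heard by agent i, with additive
    # noise whose variance decays over time
    received = x[None, :] + (sigma / k) * rng.standard_normal((n, n))
    mixed = (W * received).sum(axis=1)  # consensus step on noisy states
    g = np.sign(x - c)                  # a subgradient of each f_i at x_i
    x = project(mixed - alpha * g)      # projected subgradient update
```

With diminishing step sizes and decaying noise, the agents' states stay in `[-1, 1]` by construction and cluster near the minimizing interval `[0, 0.5]` of this toy objective; the decay conditions needed for convergence in general are the subject of analyses like the one this record describes.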
Cites Work
- Distributed constrained optimal consensus of multi-agent systems
- How to differentiate the projection on a convex set in Hilbert space. Some applications to variational inequalities
- Convergence rate analysis of distributed optimization with projected subgradient algorithm
- Consensus conditions of continuous-time multi-agent systems with time-delays and measurement noises
- Distributed stochastic gradient tracking methods
- Exponential convergence of distributed primal-dual convex optimization algorithm without strong convexity
- Distributed approximate Newton algorithms and weight design for constrained optimization
- Primal-dual stochastic distributed algorithm for constrained convex optimization
- On the Convergence of Decentralized Gradient Descent
- On Convergence Rate of Distributed Stochastic Gradient Algorithm for Convex Optimization with Inequality Constraints
- Revisiting EXTRA for Smooth Distributed Optimization
- A Randomized Incremental Subgradient Method for Distributed Optimization in Networked Systems
- Distributed Averaging With Random Network Graphs and Noises
- DEXTRA: A Fast Algorithm for Optimization Over Directed Graphs
- Consensus Conditions of Continuous-Time Multi-Agent Systems with Additive and Multiplicative Measurement Noises
- Exact Diffusion for Distributed Optimization and Learning—Part II: Convergence Analysis
- Randomized Block Proximal Methods for Distributed Stochastic Big-Data Optimization
- Distributed Big-Data Optimization via Blockwise Gradient Tracking
- Accelerated Distributed Nesterov Gradient Descent
- Fenchel Dual Gradient Methods for Distributed Convex Optimization Over Time-Varying Networks
- Distributed Continuous-Time and Discrete-Time Optimization With Nonuniform Unbounded Convex Constraint Sets and Nonuniform Stepsizes
- Distributed Coupled Multiagent Stochastic Optimization
- Consensus Problems in Networks of Agents With Switching Topology and Time-Delays
This page was built for publication: Projected subgradient based distributed convex optimization with transmission noises