An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization
From MaRDI portal
Publication: 6419592
arXiv: 2212.02387 · MaRDI QID: Q6419592
Author name not available
Publication date: 5 December 2022
Abstract: This paper studies stochastic optimization for the decentralized nonconvex-strongly-concave minimax problem. We propose a simple and efficient algorithm, called the Decentralized Recursive-gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantee for finding an $\epsilon$-stationary point of the primal function. For the online setting, the proposed method requires $\mathcal{O}(\kappa^3\epsilon^{-3})$ stochastic first-order oracle (SFO) calls and $\mathcal{O}\big(\kappa^2\epsilon^{-2}(1-\lambda_2(W))^{-1/2}\big)$ communication rounds to find an $\epsilon$-stationary point, where $\kappa$ is the condition number and $\lambda_2(W)$ is the second-largest eigenvalue of the gossip matrix $W$. For the offline setting with $n$ component functions in total, the proposed method requires $\mathcal{O}(\sqrt{n}\,\kappa^2\epsilon^{-2})$ SFO calls and the same communication complexity as the online setting.
Has companion code repository: https://github.com/TrueNobility303/DREAM
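The abstract's setup — each node holding a local component function and communicating through a doubly stochastic gossip matrix $W$ — can be illustrated with a minimal single-process simulation. The sketch below runs plain decentralized gradient descent-ascent, not the paper's DREAM method (which additionally uses recursive, variance-reduced gradient estimators); the toy objective, ring topology, and step sizes are all hypothetical choices for illustration.

```python
import numpy as np

# Illustrative sketch only, NOT the DREAM algorithm: plain decentralized
# gradient descent-ascent with gossip averaging. Node i holds the
# hypothetical local function
#     f_i(x, y) = x^4/4 - x^2/2 + b_i*x*y - y^2/2,
# which is nonconvex in x and 1-strongly concave in y.

m = 4                                   # number of nodes
b = np.array([0.5, -0.3, 0.2, -0.4])    # per-node coupling coefficients

# Doubly stochastic gossip matrix W for a ring of m nodes;
# its second-largest eigenvalue lambda_2(W) governs mixing speed.
W = np.zeros((m, m))
for i in range(m):
    W[i, i] = 0.5
    W[i, (i + 1) % m] = 0.25
    W[i, (i - 1) % m] = 0.25

x = np.ones(m)           # each node's local copy of the primal variable
y = np.zeros(m)          # each node's local copy of the dual variable
eta_x, eta_y = 0.05, 0.1

for _ in range(2000):
    gx = x**3 - x + b * y        # local grad_x f_i(x_i, y_i)
    gy = b * x - y               # local grad_y f_i(x_i, y_i)
    x = W @ x - eta_x * gx       # gossip-average with neighbors, descend in x
    y = W @ y + eta_y * gy       # gossip-average with neighbors, ascend in y

# How far the nodes are from consensus on x.
consensus_gap = np.abs(x - x.mean()).max()

# Stationarity of the averaged primal: maximizing the averaged objective
# over y gives y = mean(b)*x, so the primal gradient at consensus is
# x^3 - x + mean(b)^2 * x.
xbar = x.mean()
primal_grad = xbar**3 - xbar + np.mean(b) ** 2 * xbar
```

For this ring, $\lambda_2(W) = 0.5$, so disagreement between nodes contracts by half per communication round; the abstract's $(1-\lambda_2(W))^{-1/2}$ factor reflects exactly this dependence of communication cost on the mixing rate.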