TiAda: A Time-scale Adaptive Algorithm for Nonconvex Minimax Optimization


arXiv: 2210.17478 · MaRDI QID: Q6415683

Xiang Li, Junchi Yang, Niao He

Publication date: 31 October 2022

Abstract: Adaptive gradient methods have shown their ability to adjust the stepsizes on the fly in a parameter-agnostic manner, and empirically achieve faster convergence for solving minimization problems. When it comes to nonconvex minimax optimization, however, current convergence analyses of gradient descent ascent (GDA) combined with adaptive stepsizes require careful tuning of hyper-parameters and the knowledge of problem-dependent parameters. Such a discrepancy arises from the primal-dual nature of minimax problems and the necessity of delicate time-scale separation between the primal and dual updates in attaining convergence. In this work, we propose a single-loop adaptive GDA algorithm called TiAda for nonconvex minimax optimization that automatically adapts to the time-scale separation. Our algorithm is fully parameter-agnostic and can achieve near-optimal complexities simultaneously in deterministic and stochastic settings of nonconvex-strongly-concave minimax problems. The effectiveness of the proposed method is further justified numerically for a number of machine learning applications.
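
For intuition, below is a minimal NumPy sketch of the kind of time-scale adaptive GDA update the abstract describes: both players use AdaGrad-style stepsizes built from accumulated squared gradient norms, and the primal stepsize is normalized by the larger of the two accumulators, so the dual (max) player effectively runs on a faster time scale without any manual ratio tuning. The function name, the exponent choices, and the quadratic toy problem are illustrative assumptions, not the authors' reference implementation (which is in the repository linked below).

```python
import numpy as np

def tiada_sketch(grad_x, grad_y, x0, y0, eta_x=0.05, eta_y=0.05,
                 alpha=0.6, beta=0.4, steps=2000, eps=1e-12):
    """Illustrative time-scale adaptive GDA loop (not the reference code).

    grad_x, grad_y: callables returning the partial gradients of f(x, y).
    Choosing alpha > beta makes the primal stepsize decay faster, which
    yields the time-scale separation the abstract refers to.
    """
    x, y = np.asarray(x0, dtype=float), np.asarray(y0, dtype=float)
    vx = vy = eps  # accumulated squared gradient norms (eps avoids div by 0)
    for _ in range(steps):
        gx, gy = grad_x(x, y), grad_y(x, y)
        vx += float(np.sum(gx ** 2))
        vy += float(np.sum(gy ** 2))
        # Primal step is normalized by the *larger* accumulator, so the
        # dual update is automatically the faster of the two.
        x = x - eta_x / max(vx, vy) ** alpha * gx  # descent on x
        y = y + eta_y / vy ** beta * gy            # ascent on y
    return x, y

# Sanity check on a convex-concave quadratic toy (the paper targets the
# harder nonconvex-strongly-concave setting): f(x, y) = x^2/2 + xy - y^2/2,
# whose saddle point is (0, 0).
gx = lambda x, y: x + y
gy = lambda x, y: x - y
x_end, y_end = tiada_sketch(gx, gy, x0=1.0, y0=1.0)
print(x_end, y_end)  # both should approach 0
```

The single accumulator comparison `max(vx, vy)` is the key design choice in this sketch: whenever the dual gradients are large, the primal player is slowed down automatically, which is how the method stays parameter-agnostic.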

Has companion code repository: https://github.com/ShawnLixx/time-scale_adaptive_minimax
