
Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping

From MaRDI portal
Publication:6341185

arXiv: 2005.10785 · MaRDI QID: Q6341185

Author name not available

Publication date: 21 May 2020

Abstract: In this paper, we propose a new accelerated stochastic first-order method called clipped-SSTM for smooth convex stochastic optimization with heavy-tailed noise in the stochastic gradients, and derive the first high-probability complexity bounds for this method, closing a gap in the theory of stochastic optimization with heavy-tailed noise. Our method is based on a special variant of accelerated Stochastic Gradient Descent (SGD) combined with clipping of the stochastic gradients. We extend our method to the strongly convex case and prove new complexity bounds that outperform state-of-the-art results in this setting. Finally, we extend our proof technique and derive the first non-trivial high-probability complexity bounds for SGD with clipping without the light-tails assumption on the noise.
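The core building block described in the abstract is gradient clipping: rescaling a stochastic gradient so its norm never exceeds a threshold, which tames heavy-tailed noise. The sketch below illustrates this idea with plain clipped SGD on a toy quadratic corrupted by Student-t noise (infinite variance, i.e. heavy tails). It is a minimal illustration, not the paper's accelerated clipped-SSTM; the function names, step size, and clipping level are illustrative assumptions.

```python
import numpy as np

def clip(g, lam):
    """Rescale g so that ||g|| <= lam; shorter vectors pass through unchanged."""
    norm = np.linalg.norm(g)
    return g if norm <= lam else (lam / norm) * g

def clipped_sgd(grad_oracle, x0, step=0.05, lam=1.0, iters=2000, seed=0):
    """Plain SGD with clipped stochastic gradients -- a simpler cousin of the
    paper's clipped-SSTM (no acceleration; step, lam, iters are illustrative)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        x -= step * clip(grad_oracle(x, rng), lam)
    return x

# Toy problem: f(x) = 0.5 * ||x||^2, so grad f(x) = x, corrupted by
# Student-t noise with df=2 (infinite variance, hence heavy-tailed).
def noisy_grad(x, rng):
    return x + rng.standard_t(df=2.0, size=x.shape)

x = clipped_sgd(noisy_grad, x0=np.array([3.0, 3.0]))
print(np.linalg.norm(x))  # much smaller than the starting distance ~4.24
```

Without clipping, a single heavy-tailed draw can throw the iterate arbitrarily far; with clipping, each step moves at most `step * lam`, which is what makes high-probability guarantees possible under heavy-tailed noise.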




Has companion code repository: https://github.com/eduardgorbunov/accelerated_clipping








