SpiderBoost and Momentum: Faster Stochastic Variance Reduction Algorithms

From MaRDI portal
Publication: 6308746

arXiv: 1810.10690
MaRDI QID: Q6308746

Author name not available

Publication date: 24 October 2018

Abstract: SARAH and SPIDER are two recently developed stochastic variance-reduced algorithms, and SPIDER has been shown to achieve a near-optimal first-order oracle complexity in smooth nonconvex optimization. However, SPIDER uses an accuracy-dependent stepsize that slows down convergence in practice, and it cannot handle objective functions that involve nonsmooth regularizers. In this paper, we propose SpiderBoost as an improved scheme, which allows a much larger, constant-level stepsize while maintaining the same near-optimal oracle complexity, and which can be extended with a proximal mapping to handle composite optimization (which is nonsmooth and nonconvex) with provable convergence guarantees. In particular, we show that proximal SpiderBoost achieves an oracle complexity of $\mathcal{O}(\min\{n^{1/2}\epsilon^{-2}, \epsilon^{-3}\})$ in composite nonconvex optimization, improving the state-of-the-art result by a factor of $\mathcal{O}(\min\{n^{1/6}, \epsilon^{-1/3}\})$. We further develop a novel momentum scheme to accelerate SpiderBoost for composite optimization, which achieves the near-optimal oracle complexity in theory and substantial improvements in experiments.
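The abstract describes the algorithm's ingredients at a high level: a SARAH/SPIDER-style recursive gradient estimator refreshed by a periodic full gradient, a constant (accuracy-independent) stepsize, and an optional proximal step for composite objectives. The sketch below only illustrates those ingredients and is not the authors' reference implementation (see the companion repository linked below); the interface (`grad_batch`, `full_grad`, `prox`, `epoch_len`) and the stepsize choice eta = 1/(2L) are assumptions made for this example, and the momentum variant is omitted.

```python
import numpy as np

def spiderboost_prox(x0, grad_batch, full_grad, n_total, prox=None, L=1.0,
                     epoch_len=100, n_iters=1000, batch_size=64, rng=None):
    """Sketch of a SPIDER/SARAH-type variance-reduced loop with a constant stepsize.

    grad_batch(x, idx): average gradient of the sampled component functions.
    full_grad(x):       full gradient over all n_total components.
    prox(y, eta):       proximal map of the nonsmooth regularizer; if None,
                        the update reduces to the plain (smooth) case.
    """
    rng = np.random.default_rng() if rng is None else rng
    eta = 1.0 / (2.0 * L)              # constant stepsize, not accuracy-dependent
    x_prev, x = x0.copy(), x0.copy()
    for k in range(n_iters):
        if k % epoch_len == 0:
            v = full_grad(x)           # full gradient at the start of each epoch
        else:
            idx = rng.integers(0, n_total, size=batch_size)
            # recursive SARAH/SPIDER gradient estimator
            v = grad_batch(x, idx) - grad_batch(x_prev, idx) + v
        x_prev = x
        y = x - eta * v                # gradient step with the constant stepsize
        x = prox(y, eta) if prox is not None else y
    return x
```

For instance, on an l1-regularized finite-sum least-squares problem one would pass minibatch and full least-squares gradients together with the soft-thresholding operator as `prox`; with `prox=None` the loop reduces to the smooth, non-composite case.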




Has companion code repository: https://github.com/SamuelHorvath/Variance_Reduced_Optimizers_Pytorch

