Stochastic Variance-Reduced Newton: Accelerating Finite-Sum Minimization with Large Batches

Publication: Q6401269 (MaRDI QID)

arXiv: 2206.02702

Author name not available

Publication date: 6 June 2022

Abstract: Stochastic variance reduction has proven effective at accelerating first-order algorithms for solving convex finite-sum optimization tasks such as empirical risk minimization. Incorporating additional second-order information has proven helpful in further improving the performance of these first-order methods. However, comparatively little is known about the benefits of using variance reduction to accelerate popular stochastic second-order methods such as Subsampled Newton. To address this, we propose Stochastic Variance-Reduced Newton (SVRN), a finite-sum minimization algorithm which enjoys all the benefits of second-order methods: simple unit step size, easily parallelizable large-batch operations, and fast local convergence, while at the same time taking advantage of variance reduction to achieve improved convergence rates (per data pass) for smooth and strongly convex problems. We show that SVRN can accelerate many stochastic second-order methods (such as Subsampled Newton) as well as iterative least squares solvers (such as Iterative Hessian Sketch), and it compares favorably to popular first-order methods with variance reduction.
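
The abstract describes the core mechanism: once per outer stage, SVRN computes a full gradient and a subsampled Hessian at a snapshot point, and then runs large-batch inner iterations that combine a variance-reduced gradient estimate with a Newton-type step of unit length. Below is a minimal sketch of that idea in Python, applied to ridge-regularized least squares; the function name, batch sizes, and iteration counts are illustrative assumptions rather than the paper's exact algorithm (see the companion repository listed below for the authors' implementation).

```python
# Minimal sketch of a variance-reduced Newton stage in the spirit of SVRN,
# shown on ridge-regularized least squares. All names and hyperparameters
# here are illustrative assumptions, not the paper's exact algorithm.
import numpy as np

def svrn_sketch(X, y, lam=1e-3, outer_iters=5, inner_iters=10,
                grad_batch=1024, hess_batch=1024, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)

    def full_grad(v):
        # Gradient of f(v) = (1/2n)||Xv - y||^2 + (lam/2)||v||^2
        return X.T @ (X @ v - y) / n + lam * v

    for _ in range(outer_iters):
        w_snap = w.copy()
        g_snap = full_grad(w_snap)          # exact gradient at the snapshot

        # Subsampled Hessian at the snapshot, factored once per outer stage.
        h_idx = rng.choice(n, size=min(hess_batch, n), replace=False)
        H = X[h_idx].T @ X[h_idx] / len(h_idx) + lam * np.eye(d)
        L = np.linalg.cholesky(H)

        for _ in range(inner_iters):
            # Variance-reduced gradient estimate over a large batch.
            s_idx = rng.choice(n, size=min(grad_batch, n), replace=False)
            Xs, ys = X[s_idx], y[s_idx]
            m = len(s_idx)
            g_batch      = Xs.T @ (Xs @ w      - ys) / m + lam * w
            g_batch_snap = Xs.T @ (Xs @ w_snap - ys) / m + lam * w_snap
            g = g_batch - g_batch_snap + g_snap

            # Newton-type update with unit step size, reusing the Cholesky factor.
            w = w - np.linalg.solve(L.T, np.linalg.solve(L, g))
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((5000, 20))
    y = X @ rng.standard_normal(20) + 0.01 * rng.standard_normal(5000)
    w_hat = svrn_sketch(X, y)
    print("final gradient norm:",
          np.linalg.norm(X.T @ (X @ w_hat - y) / 5000 + 1e-3 * w_hat))
```

One point the abstract emphasizes is that the Hessian estimate is factored once per stage and every inner step uses a unit step size, so the per-iteration work reduces to large-batch matrix products that parallelize easily.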

Companion code repository: https://github.com/svrnewton/svrn
