Inexact SARAH algorithm for stochastic optimization
From MaRDI portal
DOI: 10.1080/10556788.2020.1818081
zbMath: 1464.90045
arXiv: 1811.10105
OpenAlex: W3086762633
MaRDI QID: Q5859016
Katya Scheinberg, Lam M. Nguyen, Martin Takáč
Publication date: 15 April 2021
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1811.10105
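The paper studies an inexact variant of SARAH, in which the exact full-gradient computed at the start of each outer loop is replaced by a cheaper estimate, while the inner loop keeps the SARAH recursion v_t = ∇f_i(w_t) − ∇f_i(w_{t−1}) + v_{t−1}. A minimal sketch on a least-squares problem, assuming a mini-batch outer estimate; all parameter values and function names here are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

# Illustrative sketch of inexact SARAH on least squares (assumed setup).
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true

def grad_batch(w, idx):
    """Average gradient of 0.5*||A_i w - b_i||^2 over the samples in idx."""
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ w - bi) / len(idx)

def inexact_sarah(w, eta=0.05, outer=30, inner=50, batch=64):
    for _ in range(outer):
        # "Inexact" step: mini-batch estimate instead of the full gradient.
        idx = rng.choice(n, size=batch, replace=False)
        v = grad_batch(w, idx)
        w_prev, w = w, w - eta * v
        for _ in range(inner):
            i = [rng.integers(n)]
            # SARAH recursion: v_t = g_i(w_t) - g_i(w_{t-1}) + v_{t-1}
            v = grad_batch(w, i) - grad_batch(w_prev, i) + v
            w_prev, w = w, w - eta * v
    return w

w = inexact_sarah(np.zeros(d))
print(np.linalg.norm(w - x_true))
```

Because the outer-loop gradient is only estimated, the iterates converge to a neighborhood of the solution whose radius depends on the estimate's accuracy, rather than converging exactly as in plain SARAH.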
Related Items (5)
- Finite-sum smooth optimization with SARAH
- Random-reshuffled SARAH does not need full gradient computations
- Accelerated stochastic variance reduction for a class of convex optimization problems
- Fast Decentralized Nonconvex Finite-Sum Optimization with Recursive Variance Reduction
- A hybrid stochastic optimization framework for composite nonconvex optimization
Uses Software
Cites Work
- Minimizing finite sums with the stochastic average gradient
- Introductory lectures on convex optimization. A basic course.
- Cubic regularization of Newton method and its global performance
- Semi-stochastic coordinate descent
- A Stochastic Line Search Method with Expected Complexity Analysis
- Some methods of speeding up the convergence of iteration methods