Scheduled Restart Momentum for Accelerated Stochastic Gradient Descent
Publication: 5094616
DOI: 10.1137/21M1453311
zbMath: 1492.90132
arXiv: 2002.10583
OpenAlex: W3007093918
MaRDI QID: Q5094616
No author found.
Publication date: 4 August 2022
Published in: SIAM Journal on Imaging Sciences
Full work available at URL: https://arxiv.org/abs/2002.10583
Artificial neural networks and deep learning (68T07) ⋮ Convex programming (90C25) ⋮ Stochastic learning and adaptive control (93E35) ⋮ Acceleration of convergence in numerical analysis (65B99)
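The record carries no abstract, so for orientation here is a minimal NumPy sketch of the idea the title names: Nesterov-style momentum whose counter is reset on a fixed schedule (cf. the cited "Adaptive restart for accelerated gradient schemes"). The function name `srsgd_sketch`, the schedule length, and the coefficient (t - 1)/(t + 2) are illustrative assumptions, not the paper's specification; a deterministic gradient stands in for the minibatch gradient of SGD.

```python
import numpy as np

def srsgd_sketch(grad, x0, lr=0.1, restart_every=40, n_steps=200):
    """Gradient descent with Nesterov-style momentum, restarted on a schedule.

    `grad` maps a point to a (here deterministic, in SGD a minibatch)
    gradient; the momentum counter `t` is reset every `restart_every` steps.
    """
    x = x0.copy()
    v = x0.copy()  # extrapolation point
    t = 1          # momentum counter, reset at each scheduled restart
    for k in range(n_steps):
        x_new = v - lr * grad(v)            # gradient step at extrapolated point
        momentum = (t - 1) / (t + 2)        # increasing Nesterov-style coefficient
        v = x_new + momentum * (x_new - x)  # momentum (extrapolation) step
        x = x_new
        # scheduled restart: zero the momentum by resetting the counter
        t = 1 if (k + 1) % restart_every == 0 else t + 1
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 * ||A x||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
grad = lambda x: A.T @ (A @ x)
x_star = srsgd_sketch(grad, x0=rng.standard_normal(10))
print(np.linalg.norm(A @ x_star))  # should be near 0
```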
Related Items (3)
Adaptive and Implicit Regularization for Matrix Completion ⋮ Learning proper orthogonal decomposition of complex dynamics using heavy-ball neural ODEs ⋮ How does momentum benefit deep neural networks architecture design? A few case studies
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- An optimal method for stochastic composite optimization
- Introductory lectures on convex optimization. A basic course.
- Adaptive restart for accelerated gradient schemes
- Optimal methods of smooth convex minimization
- Optimization Methods for Large-Scale Machine Learning
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Some methods of speeding up the convergence of iteration methods
- Convex Analysis
- Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions