Accelerating gradient descent and Adam via fractional gradients
DOI: 10.1016/j.neunet.2023.01.002 · OpenAlex: W4315644348 · MaRDI QID: Q6057934
George Em Karniadakis, Jérôme Darbon, Yeonjong Shin
Publication date: 26 October 2023
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2023.01.002
Related Items (1)
Cites Work
- Localization of nonlocal gradients in various topologies
- Fractional vector calculus and fractional Maxwell's equations
- Introductory lectures on convex optimization. A basic course.
- Study on fractional order gradient methods
- Cauchy and the gradient method
- Towards a unified theory of fractional and nonlocal vector calculus
- Fractional-order gradient descent learning of BP neural networks with Caputo derivative
- Fractional differential equation approach for convex optimization with convergence rate analysis
- Generalization of the gradient method with fractional order gradient direction
- Tikhonov Regularization and Total Least Squares
- Numerical methods for nonlocal and fractional models
- Numerical optimization. Theoretical and practical aspects. Transl. from the French
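The cited works on Caputo-derivative and fractional-order gradient methods point at the core idea behind the publication's title: replacing the integer-order gradient in descent updates with a fractional-order one. Below is a minimal sketch of that idea, assuming the first-term truncated-Caputo update common in the fractional-gradient literature (e.g., the cited "Study on fractional order gradient methods"); the function name `fractional_gd`, the step size, the order alpha, and the quadratic test objective are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from math import gamma

def fractional_gd(grad_f, x0, alpha=0.9, lr=0.1, n_iters=100, eps=1e-12):
    """Illustrative truncated-Caputo fractional gradient descent.

    Uses the first-term truncation of the Caputo fractional derivative,
    taken relative to the previous iterate:
        x_{k+1} = x_k - lr * grad_f(x_k) * |x_k - x_{k-1}|^(1-alpha) / Gamma(2-alpha)
    Setting alpha = 1 recovers plain gradient descent.
    """
    c = 1.0 / gamma(2.0 - alpha)
    x_prev = x0.copy()
    x = x0 - lr * grad_f(x0)  # one plain GD step to seed the iterate history
    for _ in range(n_iters - 1):
        g = grad_f(x)
        # Elementwise fractional scaling of the gradient; eps avoids 0**(1-alpha)
        # stalling the iteration when consecutive iterates coincide.
        scale = c * (np.abs(x - x_prev) + eps) ** (1.0 - alpha)
        x_prev, x = x, x - lr * g * scale
    return x

# Toy usage: minimize f(x) = 0.5 * ||x||^2, whose gradient is x itself.
x_star = fractional_gd(lambda x: x, x0=np.ones(3), alpha=0.9, lr=0.5)
print(x_star)  # approaches the minimizer at the origin
```

The same fractional scaling of the raw gradient could, under the same assumptions, be plugged into the gradient slot of an Adam-style update; how the paper actually couples the two is not recoverable from this record.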