A multivariate adaptive gradient algorithm with reduced tuning efforts
From MaRDI portal
Publication: 6488713
DOI: 10.1016/j.neunet.2022.05.016
MaRDI QID: Q6488713
Samer Saab, Khaled Saab, Minghui Zhu, Asok Kumar Ray, Shashi Phoha
Publication date: 17 October 2023
Published in: Neural Networks
Cites Work
- A literature survey of benchmark functions for global optimisation problems
- Introductory lectures on convex optimization. A basic course.
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- A New Class of Incremental Gradient Methods for Least Squares Problems
- Batched Stochastic Gradient Descent with Weighted Sampling
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- A Stochastic Approximation Method
- Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm
- Smoothing neural network for \(L_0\) regularized optimization problem with general convex constraints
- Convergence of the RMSProp deep learning method with penalty for nonconvex optimization