A novel stepsize for gradient descent method
From MaRDI portal
Publication:6564290
DOI: 10.1016/j.orl.2024.107072
MaRDI QID: Q6564290
Pham Thi Hoai, Nguyen The Vinh, Nguyen Phung Hai Chung
Publication date: 1 July 2024
Published in: Operations Research Letters
Keywords: convex programming; nonlinear programming; projected gradient method; gradient descent method; constrained optimization problem
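The paper concerns stepsize selection for (projected) gradient descent; its novel stepsize formula is not reproduced on this record page. As generic context only, the sketch below shows gradient descent with the classical Barzilai-Borwein stepsize (cf. the cited "Two-Point Step Size Gradient Methods"); the function `gradient_descent_bb` and the quadratic test problem are illustrative assumptions, not the authors' method.

```python
import numpy as np

def gradient_descent_bb(grad, x0, iters=100, alpha0=1e-3):
    """Gradient descent with the Barzilai-Borwein (BB1) stepsize.

    Illustrative sketch only: this is NOT the novel stepsize of the
    paper, whose formula is not given on this page.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0  # fallback stepsize before two iterates exist
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s = x_new - x          # change in iterates
        y = g_new - g          # change in gradients
        denom = s @ y
        if abs(denom) > 1e-12:
            alpha = (s @ s) / denom   # BB1 stepsize
        x, g = x_new, g_new
    return x

# Usage: minimize f(x) = 0.5 x'Ax - b'x, whose gradient is Ax - b,
# with A symmetric positive definite (hypothetical test data).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = gradient_descent_bb(lambda x: A @ x - b, np.zeros(2))
```

At the minimizer the gradient vanishes, so `A @ x_star` should be close to `b`.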
Cites Work
- Title not available
- Lectures on convex optimization
- Cauchy's method of minimization
- Cauchy and the gradient method
- Stochastic gradient descent with Polyak's learning rate
- Some modified fast iterative shrinkage thresholding algorithms with a new adaptive non-monotone stepsize strategy for nonsmooth and convex minimization problems
- A non-Euclidean gradient descent method with sketching for unconstrained matrix minimization
- Polyak's gradient method for split feasibility problem constrained by level sets
- Cubic regularization of Newton method and its global performance
- Minimization of functions having Lipschitz continuous first partial derivatives
- Nonlinear programming
- Introduction to Nonlinear Optimization
- Matrix and Tensor Factorization Techniques for Recommender Systems
- Two-Point Step Size Gradient Methods
- Stabilized Barzilai-Borwein Method
- The Variable Metric Forward-Backward Splitting Algorithm Under Mild Differentiability Assumptions
- Foundations of Optimization
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications