New gradient methods with adaptive stepsizes by approximate models
Publication: 6611221
DOI: 10.1080/02331934.2023.2234925
MaRDI QID: Q6611221
Zexian Liu, Hong-Wei Liu, Unnamed Author
Publication date: 26 September 2024
Published in: Optimization
Cites Work
- A Barzilai-Borwein conjugate gradient method
- Scaling on the spectral gradient method
- An efficient gradient method using the Yuan steplength
- Gradient methods with adaptive step-sizes
- Notes on the Dai-Yuan-Yuan modified spectral gradient method
- New adaptive stepsize selections in gradient methods
- Multi-step quasi-Newton methods for optimization
- Modified two-point stepsize gradient methods for unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- Analysis of monotone gradient methods
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- R-linear convergence of the Barzilai and Borwein gradient method
- On spectral properties of steepest descent methods
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A Modified BFGS Algorithm for Unconstrained Optimization
- Linear Convergence of Subgradient Algorithm for Convex Feasibility on Riemannian Manifolds
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Two-Point Step Size Gradient Methods
- Self-Scaling Variable Metric (SSVM) Algorithms. Part I: Criteria and Sufficient Conditions for Scaling a Class of Algorithms
- Self-Scaling Variable Metric (SSVM) Algorithms. Part II: Implementation and Experiments
- Gradient Method with Retards and Generalizations
- Alternate minimization gradient method
- Subgradient algorithms on Riemannian manifolds of lower bounded curvatures
- An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- Equipping the Barzilai-Borwein Method with the Two Dimensional Quadratic Termination Property
- On the Barzilai and Borwein choice of steplength for the gradient method
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Benchmarking optimization software with performance profiles
This page was built for publication: New gradient methods with adaptive stepsizes by approximate models