Global convergence of Hager-Zhang type Riemannian conjugate gradient method
From MaRDI portal
Publication: 2101972
DOI: 10.1016/j.amc.2022.127685
OpenAlex: W4284678912
MaRDI QID: Q2101972
Hiroyuki Sakai, Hideaki Iiduka, Hiroyuki Sato
Publication date: 7 December 2022
Published in: Applied Mathematics and Computation
Full work available at URL: https://arxiv.org/abs/2207.01855
Mathematical programming (90Cxx)
Numerical methods for mathematical programming, optimization and variational techniques (65Kxx)
Operations research, mathematical programming (90-XX)
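For orientation, the Euclidean Hager-Zhang conjugate gradient parameter that this work generalizes to Riemannian manifolds can be sketched as follows (a standard form of the update; the precise Riemannian transcription in the paper uses retractions and vector transports, so the formula below is only an illustrative Euclidean baseline):

```latex
% Euclidean Hager-Zhang CG update (baseline being generalized):
%   search direction  d_{k+1} = -g_{k+1} + \beta_k^{HZ} d_k,
% with y_k = g_{k+1} - g_k and
\beta_k^{HZ}
  = \frac{1}{d_k^{\top} y_k}
    \left( y_k - 2\, d_k \frac{\lVert y_k \rVert^2}{d_k^{\top} y_k} \right)^{\!\top} g_{k+1}.
% In the Riemannian version, g_k and d_k live in tangent spaces at
% different iterates, so d_k and y_k must be moved to T_{x_{k+1}}M
% via a vector transport before forming these inner products.
```

The method's appeal is that (in the Euclidean case, under a standard line search) it guarantees sufficient descent independently of the line search accuracy; establishing an analogous global convergence result in the Riemannian setting is the paper's contribution.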
Related Items
Uses Software
Cites Work
- A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions
- Hybrid Riemannian conjugate gradient methods with global convergence properties
- Sufficient descent Riemannian conjugate gradient methods
- Optimization Methods on Riemannian Manifolds and Their Application to Shape Space
- Optimization Techniques on Riemannian Manifolds
- Riemannian Conjugate Gradient Methods: General Framework and Specific Algorithms with Convergence Analyses
- Riemannian Optimization and Its Applications
- A new, globally convergent Riemannian conjugate gradient method
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Maxima for Graphs and a New Proof of a Theorem of Turán
- Benchmarking optimization software with performance profiles.