Convergence rate of descent method with new inexact line-search on Riemannian manifolds
From MaRDI portal
Publication: 1730772
DOI: 10.1007/s10957-018-1390-6
zbMath: 1407.65065
OpenAlex: W2893613551
Wikidata: Q115382550
Scholia: Q115382550
MaRDI QID: Q1730772
Qamrul Hasan Ansari, Jen-Chih Yao, Xiao-bo Li, Nan-Jing Huang
Publication date: 6 March 2019
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-018-1390-6
- Numerical mathematical programming methods (65K05)
- Variational inequalities (49J40)
- Programming in abstract spaces (90C48)
- Variational inequalities (global problems) in infinite-dimensional spaces (58E35)
Related Items
- Tseng's extragradient algorithm for pseudomonotone variational inequalities on Hadamard manifolds
- Convergence of the Gauss-Newton method for convex composite optimization problems under majorant condition on Riemannian manifolds
- An explicit extragradient algorithm for equilibrium problems on Hadamard manifolds
- A nonmonotone trust region method for unconstrained optimization problems on Riemannian manifolds
- Parallel proximal point methods for systems of vector optimization problems on Hadamard manifolds without convexity
- Global convergence of Riemannian line search methods with a Zhang-Hager-type condition
- Levitin-Polyak well-posedness by perturbations for the split hemivariational inequality problem on Hadamard manifolds
Cites Work
- Gap functions and global error bounds for generalized mixed variational inequalities on Hadamard manifolds
- Existence of solutions for variational inequalities on Riemannian manifolds
- Convergence of descent method with new line search
- Convergence of quasi-Newton method with new inexact line search
- Convergence of the Newton method and uniqueness of zeros of vector fields on Riemannian manifolds
- Newton's method for sections on Riemannian manifolds: Generalized covariant \(\alpha \)-theory
- Globally convergent optimization algorithms on Riemannian manifolds: Uniform framework for unconstrained and constrained optimization
- Smooth nonlinear optimization of \(\mathbb R^n\)
- Kantorovich's theorem on Newton's method in Riemannian manifolds
- A Riemannian symmetric rank-one trust-region method
- Trust-region methods on Riemannian manifolds
- Minimization of functions having Lipschitz continuous first partial derivatives
- Optimization Methods on Riemannian Manifolds and Their Application to Shape Space
- A Broyden Class of Quasi-Newton Methods for Riemannian Optimization
- Variational Inequalities for Set-Valued Vector Fields on Riemannian Manifolds: Convexity of the Solution Set and the Proximal Point Algorithm
- Newton's method on Riemannian manifolds and a geometric model for the human spine
- Weak Sharp Minima on Riemannian Manifolds
- Numerical Optimization
- Optimization Techniques on Riemannian Manifolds
- Newton's method on Riemannian manifolds: covariant alpha theory
- Convergence Conditions for Ascent Methods