Riemannian Stochastic Variance Reduced Gradient Algorithm with Retraction and Vector Transport
DOI: 10.1137/17M1116787
zbMATH: 1421.90084
arXiv: 1702.05594
Wikidata: Q115246947 (Scholia: Q115246947)
MaRDI QID: Q5231672
Hiroyuki Sato, Bamdev Mishra, Hiroyuki Kasai
Publication date: 27 August 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1702.05594
Keywords: matrix completion; principal component analysis; retraction; Riemannian optimization; vector transport; Riemannian centroid; stochastic variance reduced gradient
MSC classification: Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Stochastic programming (90C15)
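The publication proposes a Riemannian stochastic variance reduced gradient (R-SVRG) method that replaces the exponential map and parallel transport with a retraction and a vector transport. As a rough illustration of that update scheme (not the authors' implementation), the sketch below runs SVRG on the unit sphere for a leading-eigenvector (PCA-style) cost, using the normalization retraction and orthogonal projection as the vector transport; all function names, step sizes, and loop counts here are assumptions for the example.

```python
import numpy as np

def proj(x, v):
    # Orthogonal projection of v onto the tangent space of the sphere at x;
    # also used below as a (projection-based) vector transport.
    return v - (x @ v) * x

def retract(x, v):
    # Retraction on the sphere: step in the tangent direction, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def rsvrg_sphere(A_rows, x0, step=0.01, outer=30, inner=50, rng=None):
    """Illustrative R-SVRG sketch on the unit sphere for
    f(x) = -(1/n) * sum_i (a_i^T x)^2, whose minimizer is the leading
    eigenvector of the sample covariance (a simple PCA instance)."""
    rng = rng or np.random.default_rng(0)
    n = len(A_rows)
    # Riemannian gradient of the i-th component cost at x.
    grad_i = lambda x, a: proj(x, -2.0 * (a @ x) * a)
    x_ref = x0 / np.linalg.norm(x0)
    for _ in range(outer):
        # Full Riemannian gradient at the outer reference point.
        full = sum(grad_i(x_ref, a) for a in A_rows) / n
        x = x_ref.copy()
        for _ in range(inner):
            a = A_rows[rng.integers(n)]
            # Variance-reduced direction: transport the reference-point
            # terms into the current tangent space before combining.
            v = grad_i(x, a) - proj(x, grad_i(x_ref, a)) + proj(x, full)
            x = retract(x, -step * v)
        x_ref = x
    return x_ref
```

The key structural point is that the stochastic gradient at the reference point and the full gradient both live in the tangent space at the reference point, so they must be transported to the current iterate's tangent space before the variance-reduced combination is formed.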
Related Items (21)
Uses Software
Cites Work
- Low-rank tensor completion by Riemannian optimization
- A Riemannian symmetric rank-one trust-region method
- A survey and comparison of contemporary algorithms for computing the matrix geometric mean
- A Riemannian framework for tensor computing
- Riemannian Preconditioning
- Manopt, a Matlab toolbox for optimization on manifolds
- A Broyden Class of Quasi-Newton Methods for Riemannian Optimization
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- Quasi-Martingales
- Stochastic Gradient Descent on Riemannian Manifolds
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization