Automatic Differentiation for Riemannian Optimization on Low-Rank Matrix and Tensor-Train Manifolds
From MaRDI portal
Publication: Q5094222
DOI: 10.1137/20M1356774
zbMath: 1495.65083
arXiv: 2103.14974
OpenAlex: W3148684197
Wikidata: Q114074168 (Scholia: Q114074168)
MaRDI QID: Q5094222
No author found.
Publication date: 2 August 2022
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://arxiv.org/abs/2103.14974
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Multilinear algebra, tensor calculus (15A69)
- Methods of local Riemannian geometry (53B21)
Uses Software
Cites Work
- Tensor-Train Decomposition
- Low-rank retractions: a survey and new results
- Low-rank Kronecker-product approximation to multi-dimensional nonlocal operators I. Separable approximation of multi-variate functions
- Tensor-structured preconditioners and approximate inverse of elliptic operators in \(\mathbb R^{d}\)
- Stability of low-rank tensor representations and structured multilevel preconditioning for elliptic PDEs
- Low-rank Riemannian eigensolver for high-dimensional Hamiltonians
- On manifolds of tensors of fixed TT-rank
- Riemannian Optimization for High-Dimensional Tensor Completion
- Pymanopt: A Python Toolbox for Optimization on Manifolds using Automatic Differentiation
- Low-Rank Matrix Completion by Riemannian Optimization
- A Riemannian Optimization Approach for Computing Low-Rank Solutions of Lyapunov Equations
- Evaluating Derivatives
- Jacobi--Davidson Method on Low-Rank Matrix Manifolds
- Scientific Computing - An Introduction using Maple and MATLAB
- An Extrinsic Look at the Riemannian Hessian
- Preconditioned Low-rank Riemannian Optimization for Linear Systems with Tensor Product Structure