Low-rank tensor completion by Riemannian optimization
DOI: 10.1007/s10543-013-0455-z
zbMath: 1300.65040
arXiv: 1508.02988
OpenAlex: W2081962379
Wikidata: Q115384194 (Scholia: Q115384194)
MaRDI QID: Q398628
Michael Steinlechner, Daniel Kressner, Bart Vandereycken
Publication date: 15 August 2014
Published in: BIT
Full work available at URL: https://arxiv.org/abs/1508.02988
algorithm; numerical results; reconstruction; nonlinear conjugate gradient method; low-rank approximation; high-dimensionality; Riemannian optimization; Tucker decomposition; tensor completion problem
Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Vector and tensor algebra, theory of invariants (15A72); Matrix completion problems (15A83)
Related Items (64)
Uses Software
Cites Work
- Tensor Decompositions and Applications
- The geometry of algorithms using hierarchical tensors
- Learning with tensors: a framework based on convex optimization and spectral regularization
- Fixed-rank matrix factorizations and Riemannian low-rank optimization
- Low rank tensor recovery via iterative hard thresholding
- Smoothness and Periodicity of Some Matrix Decompositions
- Low-Rank Matrix Completion by Riemannian Optimization
- A literature survey of low-rank tensor approximation techniques
- Projection-like Retractions on Matrix Manifolds
- Dynamical Tensor Approximation
- Tensor completion and low-n-rank tensor recovery via convex optimization
- Sparse tensor discretizations of high-dimensional parametric and stochastic PDEs
- A Multilinear Singular Value Decomposition
- The Power of Convex Relaxation: Near-Optimal Matrix Completion