Alternating Least Squares as Moving Subspace Correction
Publication:4562238
DOI: 10.1137/17M1148712
zbMath: 1417.65134
arXiv: 1709.07286
OpenAlex: W2962685086
MaRDI QID: Q4562238
M. V. Rakhuba, Ivan V. Oseledets, André Uschmajew
Publication date: 19 December 2018
Published in: SIAM Journal on Numerical Analysis
Full work available at URL: https://arxiv.org/abs/1709.07286
Mathematics Subject Classification
- Numerical mathematical programming methods (65K05)
- Iterative numerical methods for linear systems (65F10)
- Multilinear algebra, tensor calculus (15A69)
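The paper's subject is the alternating least squares (ALS) method for low-rank tensor approximation, interpreted as a subspace correction scheme. As a generic illustration only (not the paper's moving-subspace formulation), the following sketch applies ALS to the best rank-one approximation of a 3-way tensor, i.e. the higher-order power method that several of the cited works analyze; all names and parameters here are illustrative choices:

```python
import numpy as np

def als_rank_one(T, iters=50, seed=0):
    """Rank-one ALS for a 3-way tensor: with two factors fixed,
    each update is a linear least-squares problem, which for rank
    one reduces to a tensor contraction followed by normalization."""
    I, J, K = T.shape
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(I); u /= np.linalg.norm(u)
    v = rng.standard_normal(J); v /= np.linalg.norm(v)
    w = rng.standard_normal(K); w /= np.linalg.norm(w)
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v)
        sigma = np.linalg.norm(w); w /= sigma
    return sigma, u, v, w

# Usage: recover an exactly rank-one tensor of norm 2.
a = np.ones(3) / np.sqrt(3)
b = np.ones(4) / 2.0
c = np.ones(5) / np.sqrt(5)
T = 2.0 * np.einsum('i,j,k->ijk', a, b, c)
sigma, u, v, w = als_rank_one(T)
```

For an exactly rank-one input the iteration reproduces the tensor: any sign flips in the intermediate factors cancel in the product `sigma * u ⊗ v ⊗ w`.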
Related Items
- Numerical approximation of Poisson problems in long domains
- SOTT: Greedy Approximation of a Tensor as a Sum of Tensor Trains
- Low-rank tensor methods for partial differential equations
- Computation of the self-diffusion coefficient with low-rank tensor methods: application to the simulation of a cross-diffusion system
- Tensor approximation of the self-diffusion matrix of tagged particle processes
- On global convergence of alternating least squares for tensor approximation
Cites Work
- Das Verfahren der Treppeniteration und verwandte Verfahren zur Lösung algebraischer Eigenwertprobleme [The staircase iteration method and related methods for solving algebraic eigenvalue problems]
- Low rank matrix completion by alternating steepest descent methods
- Musings on multilinear fitting
- Rank-One Approximation to High Order Tensors
- On Local Convergence of Alternating Schemes for Optimization of Convex Problems in the Tensor Train Format
- Low-Rank Matrix Completion by Riemannian Optimization
- Local Convergence of the Alternating Least Squares Algorithm for Canonical Tensor Approximation
- On the Global Convergence of the Alternating Least Squares Method for Rank-One Approximation to Generic Tensors
- A new convergence proof for the higher-order power method and generalizations
- Convergence Results for Projected Line-Search Methods on Varieties of Low-Rank Matrices Via Łojasiewicz Inequality
- The method of alternating projections and the method of subspace corrections in Hilbert space
- The Differentiation of Pseudo-Inverses and Nonlinear Least Squares Problems Whose Variables Separate
- Preconditioned Low-rank Riemannian Optimization for Linear Systems with Tensor Product Structure