On accelerating the regularized alternating least-squares algorithm for tensors
Publication: 1744316
DOI: 10.1553/etna_vol48s1
zbMath: 1391.65113
arXiv: 1507.04721
OpenAlex: W2963772710
Wikidata: Q114052068
Scholia: Q114052068
MaRDI QID: Q1744316
Stefan Kindermann, Carmeliza Navasca, Xiao-Fei Wang
Publication date: 23 April 2018
Published in: ETNA. Electronic Transactions on Numerical Analysis
Full work available at URL: https://arxiv.org/abs/1507.04721
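As a rough orientation to the topic of this record, the following is a minimal sketch of a plain (ridge-)regularized alternating least-squares iteration for a rank-R CP decomposition of a third-order tensor, assuming only NumPy. It is a generic textbook-style formulation, not the accelerated scheme studied in the paper; the routine name `regularized_als` and the parameters `lam` and `iters` are illustrative assumptions.

```python
# Sketch of regularized ALS for a rank-R CP decomposition of a 3-way tensor.
# Generic illustration only; NOT the paper's accelerated variant.
import numpy as np


def khatri_rao(X, Y):
    # Column-wise Kronecker product: rows of X vary slowest.
    return (X[:, None, :] * Y[None, :, :]).reshape(-1, X.shape[1])


def regularized_als(T, R, lam=1e-3, iters=100, seed=0):
    """Approximate T[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r]."""
    I, J, K = T.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))

    # Mode-n unfoldings, ordered consistently with khatri_rao above.
    T1 = T.reshape(I, J * K)
    T2 = np.moveaxis(T, 1, 0).reshape(J, I * K)
    T3 = np.moveaxis(T, 2, 0).reshape(K, I * J)

    reg = lam * np.eye(R)
    for _ in range(iters):
        # Each factor update is a ridge-regularized linear least-squares solve.
        KR = khatri_rao(B, C)
        A = np.linalg.solve(KR.T @ KR + reg, (T1 @ KR).T).T
        KR = khatri_rao(A, C)
        B = np.linalg.solve(KR.T @ KR + reg, (T2 @ KR).T).T
        KR = khatri_rao(A, B)
        C = np.linalg.solve(KR.T @ KR + reg, (T3 @ KR).T).T
    return A, B, C


if __name__ == "__main__":
    # Small synthetic check: recover a rank-3 tensor up to the regularization bias.
    rng = np.random.default_rng(1)
    A0, B0, C0 = (rng.standard_normal((n, 3)) for n in (6, 7, 8))
    T = np.einsum("ir,jr,kr->ijk", A0, B0, C0)
    A, B, C = regularized_als(T, R=3)
    err = np.linalg.norm(T - np.einsum("ir,jr,kr->ijk", A, B, C)) / np.linalg.norm(T)
    print(f"relative error: {err:.2e}")
```

The regularization term lam * I keeps each least-squares subproblem well conditioned when a Khatri-Rao factor is nearly rank deficient; the acceleration techniques analyzed in the paper are not reflected in this sketch.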
Related Items
- SOTT: Greedy Approximation of a Tensor as a Sum of Tensor Trains
- A self-adaptive regularized alternating least squares method for tensor decomposition problems
Cites Work
- Tensor Decompositions and Applications
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- The Steffensen iteration method for systems of nonlinear equations
- Introductory lectures on convex optimization. A basic course.
- Some convergence results on the regularized alternating least-squares method for tensor decomposition
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Some applications of the Łojasiewicz gradient inequality
- Analysis of individual differences in multidimensional scaling via an \(n\)-way generalization of ``Eckart-Young'' decomposition
- On Local Convergence of Alternating Schemes for Optimization of Convex Problems in the Tensor Train Format
- A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
- Maximum Block Improvement and Polynomial Optimization
- The Alternating Linear Scheme for Tensor Optimization in the Tensor Train Format
- Local Convergence of the Alternating Least Squares Algorithm for Canonical Tensor Approximation
- On Convergence of the Maximum Block Improvement Method
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- On the Convergence of Iterative Methods for Semidefinite Linear Systems
- Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions