Practical Leverage-Based Sampling for Low-Rank Tensor Decomposition
Publication: 5099421
DOI: 10.1137/21M1441754
OpenAlex: W3039567237
Wikidata: Q114073980 (Scholia: Q114073980)
MaRDI QID: Q5099421
Brett W. Larsen, Tamara G. Kolda
Publication date: 31 August 2022
Published in: SIAM Journal on Matrix Analysis and Applications
Full work available at URL: https://arxiv.org/abs/2006.16438
Keywords: tensor decomposition, matrix sketching, randomized numerical linear algebra, canonical polyadic, leverage score sampling, RandNLA
Mathematics Subject Classification:
- 15A69 Multilinear algebra, tensor calculus
- 65F99 Numerical linear algebra
- 65F55 Numerical methods for low-rank matrix approximation; matrix compression
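The keywords above refer to leverage-score sampling from randomized numerical linear algebra, the core tool the paper applies to CP tensor decomposition. As a hedged illustration only (not the paper's algorithm), the sketch below shows the basic idea for an overdetermined least-squares problem: compute row leverage scores from an orthonormal basis of the column space, sample rows with probability proportional to those scores, rescale, and solve the much smaller sketched problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall least-squares problem: min_x ||A x - b||_2.
m, n = 2000, 10
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true + 0.01 * rng.standard_normal(m)

# Leverage score of row i: squared norm of row i of an orthonormal
# basis Q for range(A); the scores sum to rank(A) = n.
Q, _ = np.linalg.qr(A)
lev = np.sum(Q**2, axis=1)
p = lev / lev.sum()  # sampling distribution over rows

# Draw s rows with probability p_i and rescale each sampled row by
# 1/sqrt(s * p_i) so the sketched problem is unbiased.
s = 200
idx = rng.choice(m, size=s, p=p)
scale = 1.0 / np.sqrt(s * p[idx])
A_s = A[idx] * scale[:, None]
b_s = b[idx] * scale

# Solve the small sketched least-squares problem.
x_hat, *_ = np.linalg.lstsq(A_s, b_s, rcond=None)
```

In the CP-decomposition setting the design matrix is a Khatri-Rao product of factor matrices, and the paper's contribution is sampling from (bounds on) its leverage scores without forming that matrix; the toy code above only demonstrates the generic sampling step.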
Related Items (2)
- A Higher-Order Generalized Singular Value Decomposition for Rank-Deficient Matrices
- A block-randomized stochastic method with importance sampling for CP tensor decomposition
Cites Work
- Tensor Decompositions and Applications
- Faster least squares approximation
- A randomized algorithm for a tensor-based generalization of the singular value decomposition
- Simple key enumeration (and rank estimation) using histograms: an integrated approach
- Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument
- Analysis of individual differences in multidimensional scaling via an \(n\)-way generalization of ``Eckart-Young'' decomposition
- Counting Keys in Parallel After a Side Channel Attack
- Computational Advertising: Techniques for Targeting Relevant Ads
- Simpler and More Efficient Rank Estimation for Side-Channel Security Assessment
- Randomized Algorithms for Matrices and Data
- Efficient MATLAB Computations with Sparse and Factored Tensors
- Tensor Decomposition for Signal Processing and Machine Learning
- A Practical Randomized CP Tensor Decomposition
- ISLET: Fast and Optimal Low-Rank Tensor Regression via Importance Sketching
- Faster Johnson–Lindenstrauss transforms via Kronecker products
- Stochastic Gradients for Large-Scale Tensor Decomposition
- Low-Rank Tucker Approximation of a Tensor from Streaming Data
- Relative Error Tensor Low Rank Approximation
- Tensor-CUR Decompositions for Tensor-Based Data
- Fast Monte Carlo Algorithms for Matrices I: Approximating Matrix Multiplication
- Lower Memory Oblivious (Tensor) Subspace Embeddings with Fewer Random Bits: Modewise Methods for Least Squares
- Randomized numerical linear algebra: Foundations and algorithms