Lower Memory Oblivious (Tensor) Subspace Embeddings with Fewer Random Bits: Modewise Methods for Least Squares
DOI: 10.1137/19M1308116
OpenAlex: W3137806381
MaRDI QID: Q5857850
Ali Zare, Deanna Needell, Elizaveta Rebrova, Mark A. Iwen
Publication date: 8 April 2021
Published in: SIAM Journal on Matrix Analysis and Applications
Full work available at URL: https://arxiv.org/abs/1912.08294
dimensionality reduction; least squares fitting; tensors; Johnson-Lindenstrauss embeddings; fast approximation algorithms; low-rank tensors; CP decompositions
Probability in computer science (algorithm analysis, random structures, phase transitions, etc.) (68Q87)
Numerical methods for low-rank matrix approximation; matrix compression (65F55)
Related Items (5)
Uses Software
Cites Work
- Tensor Decompositions and Applications
- A mathematical introduction to compressive sensing
- From quantum to classical molecular dynamics: Reduced models and numerical analysis
- Combinatorial sublinear-time Fourier algorithms
- A simple proof of the restricted isometry property for random matrices
- Database-friendly random projections: Johnson-Lindenstrauss with binary coins
- Improved approximation guarantees for sublinear-time Fourier algorithms
- Finding frequent items in data streams
- A deterministic sparse FFT for functions with structured Fourier sparsity
- Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument
- Concentration inequalities for random tensors
- Compressed sensing with sparse binary matrices: instance optimal error guarantees in near-optimal time
- Improved sparse Fourier approximation results: Faster implementations and stronger guarantees
- Low rank tensor recovery via iterative hard thresholding
- A new class of fully discrete sparse Fourier transforms: faster stable implementations with guarantees
- Analysis of individual differences in multidimensional scaling via an \(n\)-way generalization of ``Eckart-Young'' decomposition
- Exact matrix completion via convex optimization
- Orthogonal Tensor Decompositions
- A Distributed and Incremental SVD Algorithm for Agglomerative Data Analysis on Large Networks
- A sparse Johnson
- A New Truncation Strategy for the Higher-Order Singular Value Decomposition
- Tensor decompositions for learning latent variable models
- Compressive Multiplexing of Correlated Signals
- Robust principal component analysis?
- New and Improved Johnson–Lindenstrauss Embeddings via the Restricted Isometry Property
- Sparser Johnson-Lindenstrauss Transforms
- Extensions of Lipschitz mappings into a Hilbert space
- Decoding by Linear Programming
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Efficient Tensor Completion for Color Image and Video Recovery: Low-Rank Tensor Train
- High-Dimensional Probability
- A Practical Randomized CP Tensor Decomposition
- Oblivious Sketching of High-Degree Polynomial Kernels
- Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem
- Compressed matrix multiplication