Tensor-Structured Sketching for Constrained Least Squares
DOI: 10.1137/20M1374596 · zbMath: 1493.62639 · arXiv: 2010.09791 · MaRDI QID: Q5021024
Publication date: 11 January 2022
Published in: SIAM Journal on Matrix Analysis and Applications
Full work available at URL: https://arxiv.org/abs/2010.09791
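To give a rough sense of the sketch-and-solve idea this publication studies, here is a minimal Python illustration using a generic Gaussian sketch. Note this is only an assumed, simplified stand-in: the paper's contribution concerns tensor-structured (Kronecker-type) sketching operators and constrained problems, neither of which is reproduced here.

```python
import numpy as np

# Sketch-and-solve for an overdetermined least-squares problem min_x ||Ax - b||_2.
# A generic Gaussian sketch is used purely for illustration; it is NOT the
# tensor-structured embedding constructed in the paper.
rng = np.random.default_rng(0)
n, d, m = 2000, 10, 200              # tall system (n >> d), sketch size m

A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Solution of the full problem.
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)

# Sketch: S is m-by-n with i.i.d. N(0, 1/m) entries; solve the smaller system.
S = rng.standard_normal((m, n)) / np.sqrt(m)
x_sk, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

rel_err = np.linalg.norm(x_sk - x_full) / np.linalg.norm(x_full)
```

Because a Gaussian sketch of size m much larger than d is a subspace embedding with high probability, the sketched solution `x_sk` is close to `x_full` while the solve touches only an m-by-d system.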
MSC classifications:
- Random matrices (probabilistic aspects) (60B20)
- Random matrices (algebraic aspects) (15B52)
- Convexity and finite-dimensional Banach spaces (including special norms, zonoids, etc.) (aspects of convex geometry) (52A21)
- Direct numerical methods for linear systems and matrix inversion (65F05)
- Statistical aspects of big data and data science (62R07)
Related Items (1)
Cites Work
- Tensor Decompositions and Applications
- Concentration inequalities for non-Lipschitz functions with bounded derivatives of higher order
- Toward a unified theory of sparse dimensionality reduction in Euclidean space
- Estimates of moments and tails of Gaussian chaoses
- Multilinear tensor regression for longitudinal relational data
- Some inequalities for Gaussian processes and applications
- Finding frequent items in data streams
- Concentration inequalities for polynomials in \(\alpha\)-sub-exponential random variables
- Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument
- Squared-norm empirical processes
- Reconstruction and subgaussian operators in asymptotic geometric analysis
- Majorizing measures: The generic chaining
- Iterative Hessian sketch: Fast and accurate solution approximation for constrained least-squares
- A sparse Johnson-Lindenstrauss transform
- Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform
- Tensor decompositions for learning latent variable models
- Computational Advertising: Techniques for Targeting Relevant Ads
- Randomized Sketches of Convex Programs With Sharp Guarantees
- A fast randomized algorithm for overdetermined linear least-squares regression
- Improved analysis of the subsampled randomized Hadamard transform
- Low-Rank Approximation and Regression in Input Sparsity Time
- Extensions of Lipschitz mappings into a Hilbert space
- On sparse reconstruction from Fourier and Gaussian measurements
- Sampling algorithms for \(\ell_2\) regression and applications
- A Multilinear Singular Value Decomposition
- Optimization Methods for Large-Scale Machine Learning
- High-Dimensional Probability
- A Practical Randomized CP Tensor Decomposition
- The Generic Chaining
- Structured Random Sketching for PDE Inverse Problems
- Sparse Sampling for Inverse Problems With Tensors
- Low-distortion subspace embeddings in input-sparsity time and applications to robust linear regression
- Compressed matrix multiplication