ISLET: Fast and Optimal Low-Rank Tensor Regression via Importance Sketching
From MaRDI portal
Publication: 5027035
DOI: 10.1137/19M126476X · zbMath: 1484.65095 · arXiv: 1911.03804 · OpenAlex: W3033283747 · MaRDI QID: Q5027035
Ming Yuan, Garvesh Raskutti, Yuetian Luo, Anru R. Zhang
Publication date: 3 February 2022
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/1911.03804
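The entry's title concerns low-rank tensor regression in the Tucker format. As a hedged illustration only (this is not the paper's ISLET algorithm, just a generic truncated HOSVD sketch of the Tucker low-rank structure that importance-sketching methods exploit, using NumPy with invented helper names `unfold`, `hosvd`, and `tucker_reconstruct`):

```python
import numpy as np

def unfold(T, mode):
    """Mode-k matricization of a tensor: mode-k fibers become columns."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated HOSVD: top-r left singular vectors of each unfolding,
    then project T onto those subspaces to get the core tensor."""
    U = [np.linalg.svd(unfold(T, k), full_matrices=False)[0][:, :r]
         for k, r in enumerate(ranks)]
    G = T
    for k, Uk in enumerate(U):
        G = np.moveaxis(np.tensordot(Uk.T, np.moveaxis(G, k, 0), axes=1), 0, k)
    return G, U

def tucker_reconstruct(G, U):
    """Multiply the core G by each factor matrix along its mode."""
    T = G
    for k, Uk in enumerate(U):
        T = np.moveaxis(np.tensordot(Uk, np.moveaxis(T, k, 0), axes=1), 0, k)
    return T

rng = np.random.default_rng(0)
ranks = (2, 2, 2)
# Build a tensor with exact Tucker rank (2, 2, 2)
G0 = rng.standard_normal(ranks)
U0 = [np.linalg.qr(rng.standard_normal((n, r)))[0]
      for n, r in zip((5, 6, 7), ranks)]
T = tucker_reconstruct(G0, U0)

G, U = hosvd(T, ranks)
err = np.linalg.norm(T - tucker_reconstruct(G, U)) / np.linalg.norm(T)
print(f"relative reconstruction error: {err:.2e}")  # near machine precision
```

For a tensor with exact Tucker rank, the truncated HOSVD recovers it exactly; in the noisy regression setting the paper studies, the factor subspaces must instead be estimated from sketched data, which is where the statistical analysis enters.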
- Estimation in multivariate analysis (62H12)
- Multilinear algebra, tensor calculus (15A69)
- Numerical linear algebra (65F99)
- Numerical methods for low-rank matrix approximation; matrix compression (65F55)
Related Items
- An optimal statistical and computational framework for generalized tensor estimation
- Inference for low-rank tensors -- no need to debias
- Practical Leverage-Based Sampling for Low-Rank Tensor Decomposition
- Generalized Low-Rank Plus Sparse Tensor Estimation by Fast Riemannian Optimization
- Optimality conditions for Tucker low-rank tensor optimization
- Bridging and Improving Theoretical and Computational Electrical Impedance Tomography via Data Completion
Cites Work
- Tensor Decompositions and Applications
- Tensor-Train Decomposition
- Covariate-Adjusted Tensor Classification in High-Dimensions
- On tensor completion via nuclear norm minimization
- Optimal rates of convergence for noisy sparse phase retrieval via thresholded Wirtinger flow
- Oracle inequalities and optimal inference under group sparsity
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Adaptive cross approximation of multivariate functions
- Isogeometric analysis: CAD, finite elements, NURBS, exact geometry and mesh refinement
- Multilinear tensor regression for longitudinal relational data
- Generalizing the column-row matrix decomposition to multi-way arrays
- Differential-geometric Newton method for the best rank-\((R_1, R_2, R_3)\) approximation of tensors
- Sketching meets random projection in the dual: a provable recovery algorithm for big and high-dimensional data
- Cross: efficient low-rank tensor completion
- Finding frequent items in data streams
- Maximum likelihood estimation for the tensor normal distribution: Algorithm, minimum sample size, and empirical bias and dispersion
- Krylov-type methods for tensor computations. I
- Variational calculus with sums of elementary tensors of fixed rank
- Multilayer tensor factorization with applications to recommender systems
- Greedy low-rank approximation in Tucker format of solutions of tensor linear systems
- On polynomial time methods for exact low-rank tensor completion
- Gemini: graph estimation with matrix variate normal instances
- A new scheme for the tensor representation
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- A black-box low-rank approximation algorithm for fast matrix assembly in isogeometric analysis
- Generalized high-dimensional trace regression via nuclear norm regularization
- ROP: matrix recovery via rank-one projections
- Convex regularization for high-dimensional multiresponse tensor regression
- Graphical model selection and estimation for high dimensional tensor data
- Iterative Hessian sketch: Fast and accurate solution approximation for constrained least-squares
- A literature survey of low-rank tensor approximation techniques
- Wedderburn Rank Reduction and Krylov Subspace Method for Tensor Approximation. Part 1: Tucker Case
- Tensor decompositions for learning latent variable models
- Computational Advertising: Techniques for Targeting Relevant Ads
- Optimal CUR Matrix Decompositions
- Randomized Sketches of Convex Programs With Sharp Guarantees
- Exact and Stable Covariance Estimation From Quadratic Sampling via Convex Programming
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- Sketching Sparse Matrices, Covariances, and Graphs via Tensor Products
- Random Projections for Classification: A Recovery Approach
- Best Low Multilinear Rank Approximation of Higher-Order Tensors, Based on the Riemannian Trust-Region Scheme
- Sure independence screening and compressed random sensing
- Krylov Subspace Methods for Linear Systems with Tensor Product Structure
- Hierarchical Singular Value Decomposition of Tensors
- A Singular Value Thresholding Algorithm for Matrix Completion
- Cross approximation in tensor electron density computations
- Randomized Algorithms for Matrices and Data
- Linear systems with a canonical polyadic decomposition constrained solution: Algorithms and applications
- Breaking the Curse of Dimensionality, Or How to Use SVD in Many Dimensions
- Low-Rank Approximation and Regression in Input Sparsity Time
- Sparser Johnson-Lindenstrauss Transforms
- Decoding by Linear Programming
- A Newton–Grassmann Method for Computing the Best Multilinear Rank-$(r_1,$ $r_2,$ $r_3)$ Approximation of a Tensor
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- On the Best Rank-1 and Rank-\((R_1, R_2, \ldots, R_N)\) Approximation of Higher-Order Tensors
- Optimal Algorithms for \(L_1\)-subspace Signal Processing
- Practical Sketching Algorithms for Low-Rank Matrix Approximation
- Efficient L1-Norm Principal-Component Analysis via Bit Flipping
- Tensor Decomposition for Signal Processing and Machine Learning
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Spectral Algorithms for Tensor Completion
- Low rank approximation with entrywise \(\ell_1\)-norm error
- Why Are Big Data Matrices Approximately Low Rank?
- Sparse and Low-Rank Tensor Estimation via Cubic Sketchings
- Quasi-Newton Methods on Grassmannians and Multilinear Approximations of Tensors
- Optimal Sparse Singular Value Decomposition for High-Dimensional High-Order Data
- A PTAS for ℓp-Low Rank Approximation
- Relative Error Tensor Low Rank Approximation
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Tucker Dimensionality Reduction of Three-Dimensional Arrays in Linear Time
- Tensor-CUR Decompositions for Tensor-Based Data
- Tensor Regression with Applications in Neuroimaging Data Analysis
- Tensor product analysis of partial difference equations
- Tensor Learning for Regression
- A projection method to solve linear systems in tensor format
- Model Selection and Estimation in Regression with Grouped Variables
- Applied Multiway Data Analysis
- Low-distortion subspace embeddings in input-sparsity time and applications to robust linear regression
- Algorithms for Numerical Analysis in High Dimensions
- Preconditioned Low-rank Riemannian Optimization for Linear Systems with Tensor Product Structure
- Compressed matrix multiplication
- A fast unified algorithm for solving group-lasso penalize learning problems