Multilinear Compressive Sensing and an Application to Convolutional Linear Networks
From MaRDI portal
Publication:5025785
DOI: 10.1137/18M119834X
MaRDI QID: Q5025785
François Malgouyres, Joseph M. Landsberg
Publication date: 3 February 2022
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/1703.08044
Related Items (5)
- Efficient Identification of Butterfly Sparse Matrix Factorizations
- A survey on deep matrix factorizations
- Spherical Image Inpainting with Frame Transformation and Data-Driven Prior Deep Networks
- Parameter identifiability of a deep feedforward ReLU neural network
- An embedding of ReLU networks and an analysis of their identifiability
Cites Work
- Witness sets of projections
- On the identifiability of overcomplete dictionaries via the minimisation principle underlying K-SVD
- Toward fast transform learning
- Nonparametric regression using deep neural networks with ReLU activation function
- Rapid, robust, and reliable blind deconvolution via nonconvex optimization
- Membership tests for images of algebraic sets by linear projections
- Exact matrix completion via convex optimization
- PhaseLift: Exact and Stable Signal Recovery from Magnitude Measurements via Convex Programming
- Sparse Signal Recovery from Quadratic Measurements via Convex Programming
- Sparse and Spurious: Dictionary Learning With Noise and Outliers
- Sample Complexity of Dictionary Learning and Other Matrix Factorizations
- A Clustering Approach to Learning Sparsely Used Overcomplete Dictionaries
- Fundamental Performance Limits for Ideal Decoders in High-Dimensional Linear Inverse Problems
- Blind Deconvolution Using Convex Programming
- Compressed sensing and best k-term approximation
- Lifting for Blind Deconvolution in Random Mask Imaging: Identifiability and Convex Relaxation
- Self-calibration and biconvex compressive sensing
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- On the Complexity of Nonnegative Matrix Factorization
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Universal approximation bounds for superpositions of a sigmoidal function
- On Convergence of Kronecker Graphical Lasso Algorithms
- Spurious Valleys in Two-layer Neural Network Optimization Landscapes
- Dictionary Identification—Sparse Matrix-Factorization via $\ell_1$-Minimization
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Basic Algebraic Geometry 1
- Learning the parts of objects by non-negative matrix factorization
- Computing a nonnegative matrix factorization -- provably
- The Rotation of Eigenvectors by a Perturbation. III
- Phase Retrieval via Matrix Completion
- Compressed sensing