Hybrid tensor decomposition in neural network compression
From MaRDI portal
Publication:2057771
DOI: 10.1016/j.neunet.2020.09.006
zbMath: 1475.68325
arXiv: 2006.15938
OpenAlex: W3037399553
Wikidata: Q99724362 (Scholia: Q99724362)
MaRDI QID: Q2057771
Bijiao Wu, Guangshe Zhao, Lei Deng, Dingheng Wang, Guoqi Li
Publication date: 7 December 2021
Published in: Neural Networks
Full work available at URL: https://arxiv.org/abs/2006.15938
Keywords: balanced structure, hierarchical Tucker, tensor-train, hybrid tensor decomposition, neural network compression
MSC classification: Artificial neural networks and deep learning (68T07); Factorization of matrices (15A23); Multilinear algebra, tensor calculus (15A69)
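The keywords above include tensor-train decomposition, one of the two formats combined in the paper's hybrid scheme. As a hedged illustration (not the paper's own hybrid method), the sketch below implements the standard TT-SVD algorithm in NumPy: it factors a d-way tensor into a chain of 3-way cores by sequential truncated SVDs, the basic operation underlying tensor-train compression of neural network weights. The function names `tt_svd` and `tt_reconstruct` are illustrative, not from the publication.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way tensor into tensor-train (TT) cores via sequential SVD.

    Each core has shape (r_{k-1}, n_k, r_k); ranks are truncated at max_rank.
    This is the classic TT-SVD sketch, not the paper's hybrid HT/TT method.
    """
    shape = tensor.shape
    d = len(shape)
    cores = []
    r_prev = 1
    # Unfold the tensor so the first mode (times the previous rank) is the rows.
    mat = tensor.reshape(r_prev * shape[0], -1)
    for k in range(d - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(S))          # truncate the TT-rank
        cores.append(U[:, :r].reshape(r_prev, shape[k], r))
        # Carry the remaining factor forward and refold for the next mode.
        mat = (np.diag(S[:r]) @ Vt[:r]).reshape(r * shape[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a dense tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])
```

With `max_rank` large enough to keep all singular values, the reconstruction is exact up to floating-point error; compression comes from choosing small ranks, trading accuracy for a much lower parameter count.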
Related Items (1)
Uses Software
Cites Work
- Tensor-Train Decomposition
- Optimization problems in contracted tensor networks
- A new scheme for the tensor representation
- An introduction to hierarchical (\(\mathcal H\)-) rank and TT-rank of tensors with examples
- Analysis of individual differences in multidimensional scaling via an \(n\)-way generalization of ``Eckart-Young'' decomposition
- Regularized Computation of Approximate Pseudoinverse of Large Matrices Using Low-Rank Tensor Train Decompositions
- Hierarchical Singular Value Decomposition of Tensors
- Algorithm 941
- Decompositions of a Higher-Order Tensor in Block Terms—Part II: Definitions and Uniqueness
- Preconditioned Low-Rank Methods for High-Dimensional Elliptic PDE Eigenvalue Problems
This page was built for publication: Hybrid tensor decomposition in neural network compression