Stack operation of tensor networks

From MaRDI portal
Publication:6395153

arXiv: 2203.16338
MaRDI QID: Q6395153

Author name not available

Publication date: 28 March 2022

Abstract: The tensor network, as a factorization of tensors, aims to support the operations that are common for ordinary tensors, such as addition, contraction, and stacking. However, due to its non-unique network structure, only tensor network contraction is well defined so far. In this paper, we propose a mathematically rigorous definition of the tensor network stack approach, which compresses a large number of tensor networks into a single one without changing their structures and configurations. We illustrate the main ideas with matrix product state based machine learning as an example. Our results are compared with the for-loop and the efficient coding method on both CPU and GPU.
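The stacking idea in the abstract can be illustrated with a minimal NumPy sketch (not the authors' code; all names and shapes below are hypothetical): N matrix product states are batched along a new leading axis, so one batched contraction replaces a Python for loop over the N networks.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L, d, D = 8, 6, 2, 4          # N networks, L sites, physical dim d, bond dim D

def random_mps():
    """One MPS with open boundary cores of shape (left, physical, right)."""
    cores = [rng.standard_normal((1, d, D))]
    cores += [rng.standard_normal((D, d, D)) for _ in range(L - 2)]
    cores.append(rng.standard_normal((D, d, 1)))
    return cores

mps_list = [random_mps() for _ in range(N)]
x = rng.standard_normal((N, L, d))        # one input vector per site per network

def contract_one(cores, vecs):
    """Baseline: contract a single MPS with its site vectors, left to right."""
    env = np.ones((1,))                   # left boundary environment
    for core, v in zip(cores, vecs):
        # env[a] * core[a, s, b] * v[s] -> env[b]
        env = np.einsum('a,asb,s->b', env, core, v)
    return env[0]

loop_out = np.array([contract_one(mps_list[n], x[n]) for n in range(N)])

# "Stacked" version: batch the i-th cores of all N networks along a new
# leading axis (same structure, one extra index), then contract all at once.
stacked = [np.stack([mps_list[n][i] for n in range(N)]) for i in range(L)]
env = np.ones((N, 1))
for i, core in enumerate(stacked):
    env = np.einsum('na,nasb,ns->nb', env, core, x[:, i])
stack_out = env[:, 0]

assert np.allclose(loop_out, stack_out)   # both routes agree
```

On CPU and especially GPU backends, the batched einsum dispatches one large kernel instead of N small ones, which is the speedup the paper benchmarks against the for loop.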
Has companion code repository: https://github.com/veya2ztn/stack_of_tensor_network
This page was built for publication: Stack operation of tensor networks
