Progressive Transmission using Recurrent Neural Networks
From MaRDI portal
Publication:6374437
arXiv: 2108.01643 · MaRDI QID: Q6374437
Author name not available
Publication date: 3 August 2021
Abstract: In this paper, we investigate a new machine-learning-based transmission strategy called progressive transmission, or ProgTr. In ProgTr, b variables must be transmitted using at most T channel uses. The transmitter aims to deliver the data to the receiver in as few channel uses as channel conditions permit, while the receiver refines its estimate after each channel use. We use recurrent neural networks as the building block of both the transmitter and the receiver, with the SNR provided as an input representing the channel conditions. To show how ProgTr works, the proposed scheme was simulated in several scenarios, including single- and multi-user settings, different channel conditions, and both discrete and continuous input data. The results show that ProgTr achieves better performance than conventional modulation methods. In addition to performance metrics such as BER, bit-wise mutual information is used to provide some interpretation of how the transmitter and receiver operate in ProgTr.
Has companion code repository: https://github.com/safarisadegh/Progressive_transmission
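The progressive refinement loop described in the abstract can be illustrated with a minimal sketch. This is not the paper's method: the trained RNN transmitter and receiver are replaced by hypothetical hand-coded stand-ins (cycling through the b variables and averaging noisy observations), only to show the protocol shape — one symbol per channel use, with the receiver's estimate of all b variables refined after every use.

```python
import numpy as np

rng = np.random.default_rng(0)
b, T, snr_db = 4, 8, 10.0   # b variables, at most T channel uses
x = rng.standard_normal(b)  # source variables to convey

noise_std = 10 ** (-snr_db / 20)  # AWGN level implied by the SNR input

# Stand-in for the RNN transmitter (assumption, not the paper's network):
# at step t it emits one channel symbol by cycling through the variables.
def transmit(t):
    return x[t % b]

# Stand-in for the RNN receiver: refine the estimate of each variable
# after every channel use via a running mean of its noisy observations.
est = np.zeros(b)
counts = np.zeros(b)
mse = []
for t in range(T):
    y = transmit(t) + noise_std * rng.standard_normal()  # channel use t
    i = t % b
    counts[i] += 1
    est[i] += (y - est[i]) / counts[i]  # incremental-mean refinement
    mse.append(float(np.mean((est - x) ** 2)))

print(f"MSE after 1 use: {mse[0]:.3f}, after {T} uses: {mse[-1]:.3f}")
```

The monotone-in-expectation drop of the MSE across channel uses is the "progressive" property; in ProgTr itself, both the symbol mapping and the refinement rule are learned by the recurrent networks conditioned on SNR.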
This page was built for publication: Progressive Transmission using Recurrent Neural Networks