Segmented GRAND: Combining Sub-patterns in Near-ML Order

From MaRDI portal
Publication:6437766

arXiv: 2305.14892 · MaRDI QID: Q6437766

Author name not available

Publication date: 24 May 2023

Abstract: The recently introduced maximum-likelihood (ML) decoding scheme guessing random additive noise decoding (GRAND) achieves remarkably low time complexity in high signal-to-noise ratio (SNR) regimes. The complexity is higher, however, in low-SNR regimes and at low code rates. To mitigate this, we propose a scheme for a near-ML variant of GRAND, ordered reliability bits GRAND (ORBGRAND), that divides codewords into segments based on properties of the underlying code, generates for each segment only sub-patterns consistent with the syndrome (thus avoiding the generation of inconsistent error patterns), and combines them in a near-ML order using two-level integer partitions of the logistic weight. Numerical evaluation demonstrates that the proposed scheme, called segmented ORBGRAND, significantly reduces the average number of queries in every SNR regime. Moreover, segmented ORBGRAND with abandonment also improves the error-correction performance.
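The query schedule underlying ORBGRAND enumerates error patterns in non-decreasing logistic weight, where patterns of weight w correspond to integer partitions of w into distinct parts (the 1-indexed, reliability-ordered positions of the bits to flip). A minimal sketch of this schedule, and of the two-level weight split across segments, is given below. This is an illustrative sketch under stated assumptions, not the paper's implementation: the syndrome-consistency filter per segment is omitted, and `segmented_patterns` is a hypothetical simplification of the paper's two-level combination.

```python
def distinct_partitions(w, max_part):
    """Yield partitions of w into distinct positive parts, each <= max_part.

    In ORBGRAND, a partition of the logistic weight w lists the 1-indexed
    positions (in reliability order, least reliable first) of the bits
    flipped in one candidate error pattern.
    """
    if w == 0:
        yield []
        return
    for p in range(min(w, max_part), 0, -1):
        for rest in distinct_partitions(w - p, p - 1):
            yield [p] + rest


def orbgrand_patterns(n, max_weight):
    """Yield candidate error patterns (frozensets of flip positions) in
    non-decreasing logistic-weight order: the plain ORBGRAND schedule
    for a length-n block."""
    for w in range(max_weight + 1):
        for parts in distinct_partitions(w, n):
            yield frozenset(parts)


def segmented_patterns(n1, n2, max_weight):
    """Simplified sketch of the two-level idea: split the total logistic
    weight as w = w1 + w2 across two segments of lengths n1 and n2 and
    pair their sub-patterns, so combined patterns still emerge in
    non-decreasing total weight.  (The paper additionally keeps only
    sub-patterns consistent with the segment syndromes; that filter is
    omitted here for brevity.)"""
    for w in range(max_weight + 1):
        for w1 in range(w + 1):
            for p1 in distinct_partitions(w1, n1):
                for p2 in distinct_partitions(w - w1, n2):
                    # shift segment-2 positions past segment 1
                    yield frozenset(p1) | frozenset(q + n1 for q in p2)
```

For example, `list(orbgrand_patterns(10, 3))` starts with the empty pattern (weight 0), then the single flips at positions 1, 2, 3, then the pair {2, 1} of weight 3, mirroring how GRAND queries the most likely noise sequences first.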




Has companion code repository: https://github.com/mohammad-rowshan/segmented-grand








