AI Summary
In GRAND decoding of binary linear codes, there is a fundamental trade-off between the number of guessing attempts and block error rate (BLER) performance.
Method: This paper proposes a dynamic list termination mechanism based on soft-output error probability estimation, which adaptively determines the termination point during decoding without relying on structural priors of the code, thereby avoiding the BLER degradation inherent to conventional structure-aided decoding.
Contribution/Results: Experimental results demonstrate that the proposed scheme reduces the number of guesses by up to 32× while maintaining BLER performance identical to that of Guessing Codeword Decoding (GCD), the baseline GRAND variant. To the best of our knowledge, this is the first general-purpose GRAND decoding strategy that significantly lowers computational complexity without any BLER penalty.
Abstract
Proposals have been made to reduce the guesswork of Guessing Random Additive Noise Decoding (GRAND) for binary linear codes by leveraging codebook structure, at the expense of degraded block error rate (BLER). We establish that one can preserve the guesswork reduction while eliminating the BLER degradation through dynamic list decoding that terminates based on Soft Output GRAND's error probability estimate. We illustrate the approach with a method inspired by published literature and compare its performance with Guessing Codeword Decoding (GCD). We establish that it is possible to provide the same BLER performance as GCD while reducing guesswork by up to a factor of 32.
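To make the underlying decoding loop concrete, the following is a minimal, illustrative sketch of basic hard-input GRAND: noise patterns are tested in order of increasing Hamming weight, and the first pattern whose removal yields a zero syndrome gives the decoded codeword. The [7,4] Hamming code and the `max_weight` cutoff are assumptions for illustration only; the paper's scheme additionally maintains a list of candidates and terminates it dynamically using Soft Output GRAND's error probability estimate, which is not implemented here.

```python
import itertools
import numpy as np

# Parity-check matrix of the [7,4] Hamming code (illustrative choice;
# GRAND applies to any binary linear code via its parity-check matrix).
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]], dtype=np.uint8)

def grand_decode(y, H, max_weight=3):
    """Hard-input GRAND: guess noise patterns e in order of increasing
    Hamming weight and return the first c = y XOR e with zero syndrome,
    along with the number of guesses made."""
    n = H.shape[1]
    guesses = 0
    for w in range(max_weight + 1):
        for flips in itertools.combinations(range(n), w):
            guesses += 1
            e = np.zeros(n, dtype=np.uint8)
            e[list(flips)] = 1
            c = y ^ e                     # remove the guessed noise
            if not ((H @ c) % 2).any():   # zero syndrome: c is a codeword
                return c, guesses
    return None, guesses                  # abandon guessing past max_weight

# Example: a single-bit error on the all-zero codeword is corrected
# after only a few guesses.
y = np.zeros(7, dtype=np.uint8)
y[2] = 1
c, g = grand_decode(y, H)
```

Because guesses are made in decreasing order of noise likelihood, the complexity concentrates on low-weight patterns; the structure-aided and soft-output-terminated variants discussed above differ in how and when this guessing loop is cut short.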