🤖 AI Summary
Traditional binary turbo product codes (TPCs) lose inter-bit correlations, whether arising exogenously from intersymbol interference or endogenously during component-code decoding, when log-likelihood ratios (LLRs) are passed between component decoders, limiting error-correction performance. To address this, we propose a group-probability-based decoding framework and provide the first theoretical analysis quantifying its mutual information and signal-to-noise ratio (SNR) gains over conventional bit-probability decoding. We further design a non-binary TPC architecture that explicitly conveys group probabilities, integrating symbol-level ORBGRAND and soft-output GRAND (SOGRAND) to partially preserve both endogenous and exogenous correlations. Experiments show SNR gains of up to 0.3 dB under endogenous correlation and 0.7 dB under exogenous correlation, significantly improving decoding performance.
📝 Abstract
Binary turbo product codes (TPCs) are powerful error-correcting codes constructed from short component codes. Traditionally, turbo product decoding passes log-likelihood ratios (LLRs) between the component decoders, inherently losing information when bit correlation exists. Such correlation can arise exogenously, from sources like intersymbol interference, and endogenously, during component-code decoding. To preserve these correlations and improve performance, we propose turbo product decoding based on group probabilities. We theoretically predict the mutual information and signal-to-noise ratio (SNR) gains of group- over bit-probability decoding. To translate these theoretical insights into practice, we revisit non-binary TPCs, which naturally support group-probability decoding. We show that any component list decoder that takes group probabilities as input and produces block-wise soft output can partially preserve bit correlation, which we demonstrate with symbol-level ORBGRAND combined with soft-output GRAND (SOGRAND). Our results demonstrate that group-probability-based turbo product decoding achieves SNR gains of up to 0.3 dB for endogenous correlation and 0.7 dB for exogenous correlation, compared to bit-probability decoding.
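The information loss described above can be illustrated with a toy two-bit example: when bits are correlated, the per-bit marginals (and hence the LLRs) can discard all of the correlation, while the group (joint) probability retains it. A minimal Python sketch, where the joint distribution is an illustrative assumption and not from the paper:

```python
import math

# Hypothetical joint ("group") probability over two correlated bits.
# The bits agree 90% of the time, so the pair carries strong correlation.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

def entropy(probs):
    """Shannon entropy, in bits, of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Bit-wise marginals: all that a per-bit LLR interface can convey.
p_b1 = sum(p for (b1, _), p in joint.items() if b1 == 0)
p_b2 = sum(p for (_, b2), p in joint.items() if b2 == 0)

# Both marginals are 0.5, so both bit LLRs are exactly 0: the
# bit-probability interface passes no information about either bit.
llr_b1 = math.log(p_b1 / (1 - p_b1))
llr_b2 = math.log(p_b2 / (1 - p_b2))

# The mutual information between the bits is what group-probability
# decoding preserves and bit-probability decoding discards.
h_joint = entropy(joint.values())
h_marginals = entropy([p_b1, 1 - p_b1]) + entropy([p_b2, 1 - p_b2])
lost_info = h_marginals - h_joint  # correlation lost, in bits

print(f"bit LLRs: {llr_b1:.2f}, {llr_b2:.2f}")
print(f"correlation lost by LLR passing: {lost_info:.2f} bits")
```

Here both per-bit LLRs are zero even though the group probability still identifies {00, 11} as far likelier than {01, 10}; a group-probability interface would pass the full 4-ary distribution to the next component decoder instead of two uninformative LLRs.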