Group Probability Decoding of Turbo Product Codes over Higher-Order Fields

📅 2025-11-11
🤖 AI Summary
Traditional binary turbo product codes (TPCs) lose inter-bit correlations, whether arising exogenously from intersymbol interference or endogenously during component-code decoding, because soft information is propagated between component decoders as per-bit log-likelihood ratios (LLRs), limiting error-correction performance. To address this, we propose a group-probability-based decoding framework and provide the first theoretical analysis quantifying its mutual-information and signal-to-noise-ratio (SNR) gains over conventional bit-probability decoding. We further design a non-binary TPC architecture that explicitly conveys group probabilities, integrating symbol-level ORBGRAND and soft-output GRAND (SOGRAND) to partially preserve these correlations. Experimental results demonstrate SNR gains of up to 0.3 dB under endogenous correlation and 0.7 dB under exogenous correlation over bit-probability decoding.

📝 Abstract
Binary turbo product codes (TPCs) are powerful error-correcting codes constructed from short component codes. Traditionally, turbo product decoding passes log likelihood ratios (LLRs) between the component decoders, inherently losing information when bit correlation exists. Such correlation can arise exogenously from sources like intersymbol interference and endogenously during component code decoding. To preserve these correlations and improve performance, we propose turbo product decoding based on group probabilities. We theoretically predict mutual information and signal-to-noise ratio (SNR) gains of group over bit-probability decoding. To translate these theoretical insights to practice, we revisit non-binary TPCs that naturally support group-probability decoding. We show that any component list decoder that takes group probabilities as input and outputs block-wise soft-output can partially preserve bit correlation, which we demonstrate with symbol-level ORBGRAND combined with soft-output GRAND (SOGRAND). Our results demonstrate that group-probability-based turbo product decoding achieves SNR gains of up to 0.3 dB for endogenous correlation and 0.7 dB for exogenous correlation, compared to bit-probability decoding.
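To make the correlation-loss argument concrete, here is a small numerical sketch (an illustration constructed for this summary, not taken from the paper): marginalizing a correlated bit pair into per-bit probabilities, which is all that LLR exchange can carry, discards exactly the mutual information between the bits. The joint distribution below is an assumed toy example.

```python
import numpy as np

# Hypothetical joint distribution over a bit pair (b1, b2), ordered 00,01,10,11.
# Strong positive correlation: the pair is usually 00 or 11.
joint = np.array([0.45, 0.05, 0.05, 0.45])

# Marginal bit probabilities -- all that an LLR-based exchange can carry.
p_b1 = joint[2] + joint[3]   # P(b1 = 1)
p_b2 = joint[1] + joint[3]   # P(b2 = 1)

def H(p):
    """Entropy in bits of a distribution given as a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

h_joint = H(joint)                                      # group-probability view
h_marginal = H([1 - p_b1, p_b1]) + H([1 - p_b2, p_b2])  # bit-probability view

# The gap is exactly the mutual information between the bits,
# i.e. the correlation that marginalization throws away.
info_lost = h_marginal - h_joint
print(f"H(joint)          = {h_joint:.3f} bits")
print(f"H(b1) + H(b2)     = {h_marginal:.3f} bits")
print(f"lost via LLRs     = {info_lost:.3f} bits")
```

Here both marginals are 0.5, so the per-bit view is maximally uninformative, while the joint (group) view retains roughly half a bit of structure per pair.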
Problem

Research questions and friction points this paper is trying to address.

Addresses information loss in turbo product codes due to bit correlation.
Proposes group-probability decoding to preserve correlations and improve performance.
Demonstrates SNR gains for both endogenous and exogenous correlation scenarios.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Group probability decoding for turbo product codes
Non-binary TPCs naturally support group-probability decoding
Symbol-level ORBGRAND with SOGRAND preserves bit correlation
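The third point, a component list decoder that takes group probabilities in and emits block-wise soft output, can be sketched on a toy code (this construction and the probability values are assumptions for illustration; it is not the paper's ORBGRAND/SOGRAND implementation): a length-2 code over GF(4) whose two symbols must have equal 2-bit labels. Working directly on 4-ary symbol probabilities keeps intra-symbol bit correlation intact through the decoding step.

```python
from itertools import product

import numpy as np

# Per-symbol group probabilities (assumed channel observations),
# one row per symbol, columns indexed by the 2-bit labels 00,01,10,11.
P = np.array([[0.50, 0.20, 0.20, 0.10],
              [0.40, 0.30, 0.10, 0.20]])

# Enumerate all symbol pairs, keep those satisfying the toy parity
# constraint s1 XOR s2 == 0 (XOR acting on the bit labels), and score
# each surviving codeword by its group-probability product.
candidates = [(a, b) for a, b in product(range(4), repeat=2) if a ^ b == 0]
scores = np.array([P[0, a] * P[1, b] for a, b in candidates])
posterior = scores / scores.sum()          # block-wise soft output

# Updated group probabilities for symbol 1: marginalize the block
# posterior over codewords -- still a full 4-ary distribution, so the
# bit pair's correlation survives into the next decoding iteration.
updated = np.zeros(4)
for (a, _), w in zip(candidates, posterior):
    updated[a] += w
print("codeword posterior:", np.round(posterior, 3))
print("symbol-1 group probs after decoding:", np.round(updated, 3))
```

The key design point the bullet gestures at: because the soft output is a distribution over whole symbols (groups), not a pair of independent LLRs, the next component decoder receives the joint structure rather than a factorized approximation of it.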
Lukas Rapp (MIT)
Muriel Médard (Massachusetts Institute of Technology, Network Coding & Reliable Communications Group)
Ken R. Duffy (Northeastern University, Engineering Probability Information & Communications Laboratory)

Topics: Error Correction Coding, Information Theory