Spectral-Aligned Pruning for Universal Error-Correcting Code Transformers

📅 2026-02-02
🤖 AI Summary
This work addresses the challenge of deploying general-purpose error-correcting code (ECC) Transformer decoders, which are hindered by high computational complexity and large parameter counts. To overcome this, the authors propose the Spectral Alignment Pruning (SAP) framework, which, for the first time, incorporates the spectral properties of ECC bipartite graphs into structured pruning. SAP generates pruning masks that are shareable across different code types and integrates low-rank adaptation (LoRA) for lightweight, code-specific performance recovery. The method achieves decoding performance comparable to code-specific pruning strategies across multiple ECC families while substantially reducing both computational overhead and memory footprint, effectively balancing efficiency with generalization capability.
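The per-code recovery step described above, a shared pruned backbone plus small code-specific adapters, can be sketched as a frozen weight matrix with a trainable low-rank update. The class below is a minimal numpy illustration; the name `LoRALinear`, the rank, and the initialization choices are assumptions for exposition, not the paper's implementation.

```python
import numpy as np

class LoRALinear:
    """Frozen (pruned) weight W plus a low-rank, code-specific update B @ A.
    In LoRA-style recovery, only A and B would be trained per code; the
    shared backbone weight W stays fixed. Illustrative sketch only."""

    def __init__(self, W, rank=4, alpha=1.0, seed=0):
        rng = np.random.default_rng(seed)
        out_dim, in_dim = W.shape
        self.W = W                              # shared, frozen backbone weight
        self.A = rng.normal(scale=0.01, size=(rank, in_dim))
        self.B = np.zeros((out_dim, rank))      # zero init: update starts at zero
        self.alpha = alpha                      # scaling factor for the update

    def __call__(self, x):
        # Effective weight is W + alpha * B @ A; only B @ A changes per code.
        return x @ (self.W + self.alpha * self.B @ self.A).T
```

Because `B` is zero-initialized, the adapted layer initially reproduces the shared backbone exactly; storage per code is only the `A` and `B` factors, which is far smaller than a full copy of `W`.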

📝 Abstract
Recently, the Foundation Error Correction Code Transformer (FECCT) has emerged as a promising universal channel decoder, achieving competitive decoding performance across diverse code families by relying on a single shared model backbone, optionally followed by code-specific retraining. Despite this flexibility, the high computational complexity and large parameter footprint of transformer-based decoders present substantial obstacles to practical deployment. To address these challenges, we investigate structured pruning for FECCT and propose Spectral-Aligned Pruning (SAP), a structure-aware framework that enables reuse of structured pruning masks across codes by leveraging the spectrum of the corresponding bipartite graph. After pruning, SAP performs per-code recovery via parameter-efficient low-rank adaptation (LoRA), enabling a shared pruned backbone while storing only small code-specific adapter parameters. Experiments across diverse codes show that SAP achieves decoding performance comparable to dedicated per-code pruning, while enabling substantial reductions in computational cost and model memory footprint through kernel-level structured pruning.
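The abstract's key idea, deciding mask reuse from the spectrum of a code's bipartite (Tanner) graph, can be sketched as follows: build the symmetric bipartite adjacency from a parity-check matrix H, take its eigenvalues, and share a mask between two codes whose leading eigenvalues are close. The function names, the top-k comparison, and the tolerance are illustrative assumptions, not the paper's actual alignment rule.

```python
import numpy as np

def tanner_spectrum(H):
    """Eigenvalues (descending) of the bipartite adjacency [[0, H], [H^T, 0]]
    of the Tanner graph defined by an m-by-n parity-check matrix H."""
    m, n = H.shape
    A = np.zeros((m + n, m + n))
    A[:m, m:] = H          # check nodes -> variable nodes
    A[m:, :m] = H.T        # variable nodes -> check nodes
    return np.sort(np.linalg.eigvalsh(A))[::-1]

def spectra_aligned(H1, H2, k=6, tol=0.5):
    """Hypothetical sharing rule: reuse one structured pruning mask for two
    codes when their top-k Tanner-graph eigenvalues are within `tol`."""
    s1 = tanner_spectrum(H1)[:k]
    s2 = tanner_spectrum(H2)[:k]
    k = min(len(s1), len(s2))
    return float(np.linalg.norm(s1[:k] - s2[:k])) < tol
```

A useful sanity check on the construction: the spectrum of any bipartite graph is symmetric about zero, so the eigenvalue list should equal its own negated reverse.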
Problem

Research questions and friction points this paper is trying to address.

structured pruning
universal channel decoding
computational complexity
model compression
error-correcting codes
Innovation

Methods, ideas, or system contributions that make the work stand out.

Spectral-Aligned Pruning
structured pruning
universal channel decoding
low-rank adaptation
graph spectrum
Sanghyeon Cho
Department of Electrical Engineering, Pohang University of Science and Technology (POSTECH), Pohang 37673, South Korea
Taewoo Park
POSTECH
communication, machine learning, information theory
Seong-Joon Park
Institute of Artificial Intelligence, Pohang University of Science and Technology (POSTECH), Pohang 37673, South Korea
Dae-Young Yun
Department of Electrical, Electronic and Computer Engineering, University of Ulsan, Ulsan 44610, South Korea
Hee-Youl Kwak
Department of Electrical, Electronic and Computer Engineering, University of Ulsan, Ulsan 44610, South Korea
Sang-Hyo Kim
Department of Electrical and Computer Engineering, Sungkyunkwan University, Suwon 16419, South Korea
Yongjune Kim
Associate Professor of Electrical Engineering, POSTECH
coding theory, information theory, communications, machine learning, artificial intelligence