Robust Deep Joint Source Channel Coding for Task-Oriented Semantic Communications

📅 2025-03-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the insufficient robustness of deep joint source-channel coding (JSCC) in task-oriented semantic communication under stochastic channel conditions, this paper proposes a generic KL-divergence-based regularization method that enhances task performance by enforcing consistency between the noisy and noise-free posterior distributions. Innovatively, the expected KL divergence term is analytically approximated using the Fisher information matrix and the channel noise covariance, yielding an architecture-agnostic robustness enhancement that requires no changes to the network and incurs no additional inference overhead. Experiments across diverse channel models, including additive white Gaussian noise (AWGN) channels, and semantic tasks such as image classification and object detection demonstrate consistent improvements in accuracy and robustness. The proposed method thus provides a principled, lightweight, and broadly applicable solution for improving JSCC reliability in uncertain wireless environments.
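The analytical approximation mentioned above can be sketched as follows. This is a plausible reconstruction based on the standard second-order expansion of the KL divergence (the Fisher information is the local curvature of the KL divergence); the exact form used in the paper may differ:

```latex
% Expected KL divergence between the noise-free posterior p(y \mid z)
% and the noisy posterior p(y \mid z + n), for small channel noise n:
\mathbb{E}_{n}\!\left[ D_{\mathrm{KL}}\!\big( p(y \mid z+n) \,\big\|\, p(y \mid z) \big) \right]
\approx \tfrac{1}{2}\,\operatorname{tr}\!\big( F(z)\,\Sigma \big),
```

where \(F(z)\) denotes the Fisher information matrix of the posterior with respect to the channel input \(z\), and \(\Sigma = \mathbb{E}[nn^{\top}]\) is the covariance matrix of the channel noise. Because the trace term depends only on \(z\) and \(\Sigma\), it can be evaluated at training time without sampling noise realizations, which is consistent with the claim of no added inference overhead.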

📝 Abstract
Semantic communications based on deep joint source-channel coding (JSCC) aim to improve communication efficiency by transmitting only task-relevant information. However, ensuring robustness to the stochasticity of communication channels remains a key challenge in learning-based JSCC. In this paper, we propose a novel regularization technique for learning-based JSCC to enhance robustness against channel noise. The proposed method utilizes the Kullback-Leibler (KL) divergence as a regularizer term in the training loss, measuring the discrepancy between two posterior distributions: one under noisy channel conditions (noisy posterior) and one for a noise-free system (noise-free posterior). Reducing this KL divergence mitigates the impact of channel noise on task performance by keeping the noisy posterior close to the noise-free posterior. We further show that the expectation of the KL divergence given the encoded representation can be analytically approximated using the Fisher information matrix and the covariance matrix of the channel noise. Notably, the proposed regularization is architecture-agnostic, making it broadly applicable to general semantic communication systems over noisy channels. Our experimental results validate that the proposed regularization consistently improves task performance across diverse semantic communication systems and channel conditions.
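A minimal training-loss sketch of the regularization described in the abstract, written in PyTorch. All names here (`encoder`, `decoder`, `noise_std`, `lam`) are illustrative assumptions, not the paper's code: the encoder produces channel symbols, an AWGN channel perturbs them, and a KL term pulls the noisy posterior toward the noise-free posterior.

```python
import torch
import torch.nn.functional as F

def jscc_training_loss(encoder, decoder, x, y, noise_std=0.1, lam=1.0):
    """KL-regularized JSCC training loss (illustrative sketch).

    encoder: maps input x to channel symbols z.
    decoder: maps (possibly noisy) symbols to class logits.
    lam: weight of the KL regularizer (hypothetical hyperparameter).
    """
    z = encoder(x)                                   # channel input
    z_noisy = z + noise_std * torch.randn_like(z)    # simulated AWGN channel

    logits_clean = decoder(z)        # noise-free posterior p(y | z)
    logits_noisy = decoder(z_noisy)  # noisy posterior p(y | z + n)

    # Task loss is computed on the noisy branch, as the receiver
    # only ever sees channel-corrupted symbols.
    task_loss = F.cross_entropy(logits_noisy, y)

    # KL( p(y | z_noisy) || p(y | z) ), averaged over the batch.
    kl = F.kl_div(
        F.log_softmax(logits_clean, dim=-1),   # input: log of noise-free posterior
        F.log_softmax(logits_noisy, dim=-1),   # target: log of noisy posterior
        log_target=True,
        reduction="batchmean",
    )
    return task_loss + lam * kl
```

Note that this Monte-Carlo form samples one noise realization per step; the paper instead approximates the expected KL analytically via the Fisher information matrix, avoiding the sampling.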
Problem

Research questions and friction points this paper is trying to address.

Enhance the robustness of deep joint source-channel coding against channel noise.
Mitigate the impact of channel noise on task performance via KL-divergence regularization.
Develop an architecture-agnostic method applicable to diverse semantic communication systems.
Innovation

Methods, ideas, or system contributions that make the work stand out.

A KL-divergence regularizer that limits the impact of channel noise on the task posterior.
An analytical approximation of the expected KL divergence via the Fisher information matrix and noise covariance.
Architecture-agnostic regularization applicable to general semantic communication systems.