Neural Compress-and-Forward for the Primitive Diamond Relay Channel

📅 2025-12-08
📈 Citations: 0 · Influential citations: 0
🤖 AI Summary
This work addresses distributed neural compress-and-forward (CF) over the primitive diamond relay channel, targeting fully distributed compression, without inter-relay coordination, that still exploits the correlation between the relays' noisy observations of the source. The authors propose an end-to-end learned framework for distributed quantization and joint decoding, with differentiable quantizers and finite-order modulation constraints that sidestep conventional codebook design and explicit coordination overhead. In the two-relay setting, the learned scheme operates close to known theoretical bounds, in line with Berger–Tung-style distributed coding. Key contributions include: (i) an extension of neural CF from single-relay to multi-relay topologies; (ii) a learning-based distributed compression scheme that requires no inter-relay signaling; and (iii) empirical evidence that neural compression scales to, and remains near-optimal in, more complex relay networks.
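To make the "differentiable quantizer" idea concrete, below is a minimal PyTorch sketch of a scalar quantizer with learnable levels trained via a straight-through gradient estimator. The straight-through trick, the codebook size, and the initialization are illustrative assumptions here; the paper's exact quantizer design may differ.

```python
import torch
import torch.nn as nn

class LearnedQuantizer(nn.Module):
    """Scalar quantizer with learnable levels and straight-through gradients."""

    def __init__(self, num_levels: int = 4):
        super().__init__()
        # Learnable reconstruction levels; log2(num_levels) bits per symbol.
        self.levels = nn.Parameter(torch.linspace(-1.0, 1.0, num_levels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, dim). Distances to every level: (batch, dim, num_levels).
        dists = (x.unsqueeze(-1) - self.levels) ** 2
        hard = self.levels[dists.argmin(dim=-1)]  # nearest level (non-differentiable step)
        # Straight-through trick: the forward value equals `hard`, the gradient
        # w.r.t. x is the identity, and the selected levels still receive
        # gradients through the indexing above.
        return hard + x - x.detach()
```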

📝 Abstract
The diamond relay channel, where a source communicates with a destination via two parallel relays, is one of the canonical models for cooperative communications. We focus on the primitive variant, where each relay observes a noisy version of the source signal and forwards a compressed description over an orthogonal, noiseless, finite-rate link to the destination. Compress-and-forward (CF) is particularly effective in this setting, especially under oblivious relaying, where relays lack access to the source codebook. While neural CF methods have been studied in single-relay channels, extending them to the two-relay case is non-trivial, as it requires fully distributed compression without any inter-relay coordination. We demonstrate that learning-based quantizers at the relays can harness input correlations by operating remotely, yet in a collaborative fashion, enabling effective distributed compression in line with Berger–Tung-style coding. Each relay separately compresses its observation using a one-shot learned quantizer, and the destination jointly decodes the source message. Simulation results show that the proposed scheme, trained end-to-end with finite-order modulation, operates close to the known theoretical bounds. These results demonstrate that neural CF can scale to multi-relay systems while maintaining both performance and interpretability.
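The pipeline the abstract describes, two non-communicating relay encoders feeding a joint decoder, can be sketched end-to-end as below. This builds on the LearnedQuantizer sketch above; the network sizes, AWGN observation model, BPSK source modulation, and cross-entropy objective are all illustrative assumptions rather than the paper's exact setup.

```python
import torch
import torch.nn as nn

class Relay(nn.Module):
    """One relay: a small pre-processor followed by a learned quantizer."""

    def __init__(self, num_levels: int = 4):
        super().__init__()
        self.pre = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
        self.quantizer = LearnedQuantizer(num_levels)  # from the sketch above

    def forward(self, y: torch.Tensor) -> torch.Tensor:
        # Finite-rate description: log2(num_levels) bits per observation.
        return self.quantizer(self.pre(y))

relay1, relay2 = Relay(), Relay()  # no parameters are shared or exchanged
decoder = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 2))

opt = torch.optim.Adam(
    [*relay1.parameters(), *relay2.parameters(), *decoder.parameters()], lr=1e-3
)
loss_fn = nn.CrossEntropyLoss()

for step in range(1000):
    bits = torch.randint(0, 2, (256, 1))       # source message bits
    x = 2.0 * bits.float() - 1.0               # BPSK channel input (assumed)
    y1 = x + 0.5 * torch.randn_like(x)         # independent source-relay noise
    y2 = x + 0.5 * torch.randn_like(x)         # y1, y2 are correlated through x
    logits = decoder(torch.cat([relay1(y1), relay2(y2)], dim=-1))
    loss = loss_fn(logits, bits.squeeze(-1))   # joint decoding at the destination
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because y1 and y2 are noisy views of the same x, the two independently applied quantizers can learn complementary descriptions during joint training, which is the Berger–Tung-style behavior the abstract refers to.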
Problem

Research questions and friction points this paper is trying to address.

Extends neural compress-and-forward (CF) to the two-relay diamond channel without inter-relay coordination
Enables distributed compression at the relays via collaboratively trained, learning-based quantizers
Achieves near-theoretical performance with end-to-end training under finite-order modulation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural quantizers enable distributed compression without inter-relay coordination
Learned quantizers exploit input correlations despite operating remotely from one another
End-to-end training with finite-order modulation approaches known theoretical bounds (see the modulation sketch below)
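As one concrete reading of the finite-order modulation constraint, the BPSK line in the earlier training sketch could be swapped for a higher-order mapper such as the Gray-labeled 4-PAM below. The constellation choice and labeling are illustrative assumptions, since the page does not specify the modulation order used.

```python
import torch

def pam4_modulate(bit_pairs: torch.Tensor) -> torch.Tensor:
    """Gray-labeled, unit-power 4-PAM mapper (illustrative assumption)."""
    # bit_pairs: (batch, 2) integer tensor with entries in {0, 1}.
    # Gray labels 00, 01, 11, 10 map to levels -3, -1, +1, +3, then
    # the constellation is scaled by 1/sqrt(5) for unit average power.
    levels = torch.tensor([-3.0, -1.0, 3.0, 1.0]) / 5.0 ** 0.5  # indexed by 2*b0 + b1
    idx = 2 * bit_pairs[:, 0] + bit_pairs[:, 1]
    return levels[idx].unsqueeze(-1)  # shape (batch, 1)

# Drop-in replacement for the BPSK line in the training sketch above:
# x = pam4_modulate(torch.randint(0, 2, (256, 2)))
```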
Ozan Aygün
Department of Electrical and Computer Engineering, New York University, Brooklyn, NY
Ezgi Ozyilkan
PhD Student, New York University
Neural Compression · Information Theory · Machine Learning
Elza Erkip
Department of Electrical and Computer Engineering, New York University, Brooklyn, NY