🤖 AI Summary
This work addresses the challenge of achieving efficient, low-communication-cost distributed semantic communication and function computation under strong privacy guarantees. To this end, the authors propose a Randomized Distributed Function Computation (RDFC) framework that formulates semantic communication as a generalized remote source coding problem: transmitters send only the minimal information receivers need to compute randomized functions of the input data, with local differential privacy mechanisms embedded in the scheme. The framework unifies semantic communication and privacy protection, proving that rigorous privacy guarantees hold even without any shared randomness, while showing that shared randomness can substantially reduce communication rates. The theoretical analysis leverages strong coordination metrics, Wyner's common information, numerical optimization over continuous alphabets, and finite blocklength techniques. Experiments show that with sufficient shared randomness, communication rates drop by up to two orders of magnitude below the Wyner common information point; that even without shared randomness, RDFC significantly outperforms lossless transmission; and that in the non-asymptotic regime, the privacy-parameter gap to the asymptotic scheme closes exponentially fast with input length.
📝 Abstract
We establish the randomized distributed function computation (RDFC) framework, in which a sender transmits just enough information for a receiver to generate a randomized function of the input data. We describe RDFC as a form of semantic communication, viewed essentially as a generalized remote-source-coding problem, and show that security and privacy constraints fit this model naturally, since they generally require a randomization step. Using strong coordination metrics, we ensure (local differential) privacy for every input sequence and prove that such guarantees can be met even when no common randomness is shared between the transmitter and receiver.
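The abstract notes that privacy constraints generally require a randomization step. As a standalone illustration of the kind of local-differential-privacy randomization involved, the sketch below implements binary randomized response, a standard ε-LDP mechanism; it is an illustrative example of the privacy primitive, not the paper's RDFC coding scheme.

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Report the true bit with probability e^eps / (e^eps + 1), else flip it.

    This classic mechanism satisfies epsilon-local differential privacy:
    for either output value, the likelihoods under bit=0 and bit=1 differ
    by at most a factor of e^eps.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit

def unbiased_frequency_estimate(reports: list, epsilon: float) -> float:
    """Debias the observed fraction of 1s to estimate the true frequency."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)
```

Each user reports a noisy bit, so no individual input is revealed with certainty, yet an aggregator can still recover population statistics by debiasing, mirroring the summary's point that receivers compute randomized functions rather than the raw data.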
This work provides lower bounds on Wyner's common information (WCI), which is the communication cost when common randomness is absent, and proposes numerical techniques to evaluate the other corner point of the RDFC rate region for continuous-alphabet random variables with unlimited shared randomness. Experiments illustrate that a sufficient amount of common randomness can reduce the semantic communication rate by up to two orders of magnitude compared to the WCI point, while RDFC without any shared randomness still outperforms lossless transmission by a large margin. A finite blocklength analysis further confirms that the privacy parameter gap between the asymptotic and non-asymptotic RDFC methods closes exponentially fast with input length. Our results position RDFC as an energy-efficient semantic communication strategy for privacy-aware distributed computation systems.
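For context, the WCI invoked above is the standard quantity from the coordination literature (the definition below is textbook material, not a contribution of this work):

```latex
C_W(X;Y) \;=\; \min_{P_{W\mid XY}\,:\; X - W - Y} \; I(X,Y;W)
```

Here the minimum is over auxiliary variables $W$ such that $X - W - Y$ forms a Markov chain, i.e., $X$ and $Y$ are conditionally independent given $W$; this characterizes the rate needed when no common randomness is available, which is why the abstract treats the WCI point as the zero-shared-randomness corner of the RDFC rate region.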