🤖 AI Summary
This study investigates the fundamental performance limits of linear encoding in binary distributed hypothesis testing under communication constraints. Specifically, it considers a setting where two agents observe correlated binary vectors and transmit compressed information to a fusion center at equal rates, with the goal of evaluating how well linear compression schemes can perform. Using information-theoretic tools, including truncated coding, random coding, and error-exponent analysis, the work establishes that, within the class of linear encoders, simple truncation is optimal for two canonical tests: distinguishing correlations of equal magnitude and opposite sign, and testing for or against independence. However, it also shows that for the test against independence, linear codes are strictly suboptimal relative to the broader class of all encoding strategies. Numerical evidence further supports the conjecture that truncation remains the best linear code for testing between any correlations of opposite signs.
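To make the central object concrete, here is a minimal sketch (in Python/NumPy, with a hypothetical block length `n` and message length `k = nR`, neither taken from the paper) of why truncation, keeping only the first k of n observed bits, is itself a linear code over GF(2): it coincides with multiplication by the selection matrix [I_k | 0].

```python
import numpy as np

# Hypothetical parameters for illustration: block length n, message length k = nR.
n, k = 16, 8

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=n)  # one agent's binary observation X^n

# Truncation: keep the first k bits.
truncated = x[:k]

# The same map written as a linear code over GF(2): M = [I_k | 0].
M = np.hstack([np.eye(k, dtype=int), np.zeros((k, n - k), dtype=int)])
linear_image = (M @ x) % 2

assert np.array_equal(truncated, linear_image)  # truncation is a linear encoder
```

The assertion holds for every input, since multiplying by [I_k | 0] simply selects the first k coordinates; the question the paper studies is whether any other choice of M can do better for the tests considered.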
📝 Abstract
We study a binary distributed hypothesis testing problem where two agents observe correlated binary vectors and communicate compressed information at the same rate to a central decision maker. We focus on linear compression schemes and show that simple truncation is the best linear scheme in two cases: (1) testing between correlations of opposite sign but equal magnitude, and (2) testing for or against independence. We conjecture, supported by numerical evidence, that truncation is the best linear code for testing between any correlations of opposite signs. Further, for testing against independence, we also compute classical random coding exponents and show that truncation, and consequently any linear code, is strictly suboptimal.
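For intuition about the test against independence, the following Monte Carlo sketch is purely illustrative and is not the paper's scheme or its optimal test: each agent truncates its observation, and the fusion center thresholds the empirical agreement rate of the two truncated bit strings. All parameters (the block length `n`, the rate R = k/n, the crossover probability `p_flip`, the threshold, and the trial count) are assumptions chosen for demonstration.

```python
import numpy as np

def sample(n, p_flip, independent, rng):
    """X^n ~ Bern(1/2) i.i.d.; Y^n = X^n XOR BSC noise under H0, fresh Bern(1/2) under H1."""
    x = rng.integers(0, 2, size=n)
    if independent:
        y = rng.integers(0, 2, size=n)
    else:
        y = x ^ (rng.random(n) < p_flip)
    return x, y.astype(int)

def fusion_decision(x_trunc, y_trunc, threshold):
    """Declare 'correlated' (H0) if the truncated bits agree often enough."""
    return np.mean(x_trunc == y_trunc) > threshold

n, k, p_flip = 256, 128, 0.1  # hypothetical: rate R = 1/2, crossover 0.1
rng = np.random.default_rng(1)
trials = 2000
errors = {"type I": 0, "type II": 0}
for _ in range(trials):
    x0, y0 = sample(n, p_flip, independent=False, rng=rng)
    x1, y1 = sample(n, p_flip, independent=True, rng=rng)
    if not fusion_decision(x0[:k], y0[:k], threshold=0.75):
        errors["type I"] += 1   # missed the correlated hypothesis
    if fusion_decision(x1[:k], y1[:k], threshold=0.75):
        errors["type II"] += 1  # falsely declared correlation
print({e: c / trials for e, c in errors.items()})
```

Under these assumed parameters both error counts are essentially zero, since the agreement rate concentrates near 0.9 under H0 and 0.5 under H1; the paper's contribution is the exponential rate at which such errors decay and how it compares to unrestricted (nonlinear) encoders.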