The Sample Complexity of Distributed Simple Binary Hypothesis Testing under Information Constraints

📅 2025-06-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper resolves two open problems posed in arXiv:2403.16981 on distributed simple binary hypothesis testing under information constraints: (1) whether interaction reduces sample complexity, and (2) whether existing sample complexity bounds under communication constraints can be tightened. The authors show that sequential interaction does *not* reduce the sample complexity, and they derive optimally tight sample complexity bounds for the communication-constrained setting. The main technical tools are a one-shot lower bound on the Bayes error that tensorises, a streamlined proof of the sample complexity formula for unconstrained simple binary hypothesis testing, and a reverse data-processing inequality for Hellinger-λ divergences that generalises earlier results (arXiv:1812.03031, arXiv:2206.02765).

📝 Abstract
This paper resolves two open problems from a recent paper, arXiv:2403.16981, concerning the sample complexity of distributed simple binary hypothesis testing under information constraints. The first open problem asks whether interaction reduces the sample complexity of distributed simple binary hypothesis testing. In this paper, we show that sequential interaction does not help. The second problem suggests tightening existing sample complexity bounds for communication-constrained simple binary hypothesis testing. We derive optimally tight bounds for this setting and resolve this problem. Our main technical contributions are: (i) a one-shot lower bound on the Bayes error in simple binary hypothesis testing that satisfies a crucial tensorisation property; (ii) a streamlined proof of the formula for the sample complexity of simple binary hypothesis testing without constraints, first established in arXiv:2403.16981; and (iii) a reverse data-processing inequality for Hellinger-$\lambda$ divergences, generalising the results from arXiv:1812.03031 and arXiv:2206.02765.
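For context, the standard definition of the Hellinger-$\lambda$ divergence (the Hellinger divergence of order $\lambda$, an $f$-divergence) and the data-processing inequality that the paper's reverse inequality complements are as follows; this is textbook background, not material drawn from the paper itself:

```latex
% Hellinger divergence of order \lambda, for \lambda \in (0,1) \cup (1,\infty):
H_\lambda(P \,\|\, Q) = \frac{1}{\lambda - 1}
  \left( \int p^{\lambda} q^{1-\lambda} \, d\mu \;-\; 1 \right),
% i.e., the f-divergence with f(t) = (t^{\lambda} - 1)/(\lambda - 1).
% The case \lambda = 1/2 recovers the squared Hellinger distance:
H_{1/2}(P \,\|\, Q) = 2\left(1 - \int \sqrt{p\,q}\, d\mu\right) = H^2(P, Q).
% Data-processing inequality: for any channel (Markov kernel) K,
H_\lambda(K \circ P \,\|\, K \circ Q) \;\le\; H_\lambda(P \,\|\, Q).
```

A *reverse* data-processing inequality goes in the opposite direction: it lower-bounds the divergence after the channel by a channel-dependent factor times the divergence before it, which is what makes lower bounds on constrained protocols possible.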
Problem

Research questions and friction points this paper is trying to address.

Does interaction reduce sample complexity in distributed binary hypothesis testing?
Tightening bounds for communication-constrained binary hypothesis testing.
Deriving optimal bounds for distributed binary hypothesis testing.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sequential interaction does not reduce sample complexity
Optimally tight bounds for communication-constrained testing
Reverse data-processing inequality for Hellinger-λ divergences
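To make the role of Hellinger-type quantities concrete, the following self-contained sketch computes the exact Bayes error of the optimal test between $n$ i.i.d. samples from Bernoulli($p$) and Bernoulli($q$), and checks it against the classical Bhattacharyya bounds (these are standard textbook bounds, not the paper's new one-shot bound; the parameters $p = 0.5$, $q = 0.6$ are illustrative choices):

```python
from math import comb, sqrt

def bayes_error(p: float, q: float, n: int) -> float:
    """Exact Bayes error (uniform prior) of the optimal test between
    Bernoulli(p)^n and Bernoulli(q)^n. The number of successes k is a
    sufficient statistic, so the error is (1/2) * sum_k min of the two
    binomial pmfs."""
    return 0.5 * sum(
        min(comb(n, k) * p**k * (1 - p)**(n - k),
            comb(n, k) * q**k * (1 - q)**(n - k))
        for k in range(n + 1)
    )

def bhattacharyya(p: float, q: float) -> float:
    """Bhattacharyya coefficient (Hellinger affinity) of two Bernoullis;
    it tensorises: BC(P^n, Q^n) = BC(P, Q)^n."""
    return sqrt(p * q) + sqrt((1 - p) * (1 - q))

p, q = 0.5, 0.6
rho = bhattacharyya(p, q)
for n in (10, 50, 100, 200):
    err = bayes_error(p, q, n)
    # Classical bounds: (1/4) * rho^(2n) <= Bayes error <= (1/2) * rho^n,
    # so the error decays geometrically at a rate set by rho = BC(P, Q).
    assert 0.25 * rho**(2 * n) <= err <= 0.5 * rho**n
```

Because $\rho < 1$ whenever $p \neq q$, driving the error below $\delta$ requires $n = \Theta(\log(1/\delta) / \log(1/\rho))$ samples, which is the tensorisation phenomenon the paper's one-shot lower bound is designed to preserve under information constraints.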