Strong XOR Lemma for Information Complexity

📅 2024-11-20
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper investigates information complexity lower bounds for computing the $n$-fold XOR function $f^{\oplus n}$ in the two-player randomized communication model, aiming to determine whether the naive strategy of computing each $f(X_i)$ independently and then XORing the results is information-theoretically optimal. The authors establish the first strong XOR lemma for information complexity: if computing $f$ with error probability $O(n^{-1})$ requires revealing at least $I$ bits of information about the players' inputs, then computing $f^{\oplus n}$ with constant error requires revealing $\Omega(n) \cdot (I - 1 - o_n(1))$ bits. Technically, the proof combines information complexity analysis, lower-bound tools for randomized protocols, and a careful treatment of the error–resource trade-off. The result demonstrates near-linear additivity of information complexity under XOR composition, thereby establishing the information-theoretic optimality of the naive strategy. This work contributes a new tool at the intersection of communication complexity and information theory.
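Using $\mathrm{IC}(g, \varepsilon)$ as shorthand for the information cost of computing $g$ with error $\varepsilon$ (notation introduced here for readability, not taken from the paper), the lemma stated in the abstract can be summarized as:

```latex
\mathrm{IC}\!\left(f,\; O(n^{-1})\right) \ge I
\;\Longrightarrow\;
\mathrm{IC}\!\left(f^{\oplus n},\; \Theta(1)\right) \ge \Omega(n)\cdot\left(I - 1 - o_n(1)\right).
```

Since the naive protocol achieves $n \cdot \mathrm{IC}(f, O(n^{-1}))$ bits (by amplifying each copy to error $O(n^{-1})$ so that a union bound keeps the overall error constant), the lower bound matches it up to the $-1 - o_n(1)$ loss per copy.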

📝 Abstract
For any {0,1}-valued function f, its n-fold XOR is the function f⊕n where f⊕n(X₁, …, Xₙ) = f(X₁) ⊕ ⋯ ⊕ f(Xₙ). Given a procedure for computing the function f, one can apply a "naive" approach to compute f⊕n by computing each f(Xᵢ) independently, followed by XORing the outputs. This approach uses n times the resources required for computing f. In this paper, we prove a strong XOR lemma for information complexity in the two-player randomized communication model: if computing f with an error probability of O(n⁻¹) requires revealing I bits of information about the players' inputs, then computing f⊕n with a constant error requires revealing Ω(n) · (I − 1 − oₙ(1)) bits of information about the players' inputs. Our result demonstrates that the naive protocol for computing f⊕n is both information-theoretically optimal and asymptotically tight in error trade-offs.
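The naive approach from the abstract is straightforward to sketch. The sketch below uses a hypothetical single-instance function f (inner product mod 2, chosen only as a stand-in; the paper's result holds for any {0,1}-valued f):

```python
# A stand-in single-instance function f: inner product mod 2 of the
# two players' bit vectors (the paper treats arbitrary {0,1}-valued f).
def f(x, y):
    return sum(a & b for a, b in zip(x, y)) % 2

def naive_xor_protocol(xs, ys):
    """Compute f⊕n the naive way: evaluate f on each of the n input
    pairs independently, then XOR the n one-bit outputs."""
    result = 0
    for x, y in zip(xs, ys):
        result ^= f(x, y)
    return result
```

Each of the n independent evaluations reveals at most the information cost of f, so the naive protocol reveals at most n times that cost; the paper's lemma shows this is essentially unavoidable.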
Problem

Research questions and friction points this paper is trying to address.

Proving strong XOR lemma for information complexity
Analyzing naive protocol optimality for XOR functions
Establishing tight error trade-offs in communication
Innovation

Methods, ideas, or system contributions that make the work stand out.

Strong XOR lemma for information complexity
Optimal naive protocol for XOR computation
Tight error trade-offs in communication
Pachara Sawettamalya
Department of Computer Science, Princeton University
Huacheng Yu
Department of Computer Science, Princeton University
Theory