🤖 AI Summary
This paper establishes direct sum theorems for randomized parity decision trees, results stating that solving $k$ independent instances of a problem requires $\Omega(k)$ times the cost of solving a single instance. Addressing the long-standing absence of direct-sum lower bounds in this model, the work provides the first such theorems, covering the two mainstream lower-bound frameworks: lower bounds proved via the discrepancy method and lower bounds proved relative to a product distribution. In both cases it confirms the $\Omega(k)$-factor blowup in complexity. Technically, the proof combines an analysis of randomized parity decision tree complexity, sensitivity-style characterizations relative to the underlying distribution, and error control under product distributions. The result fills a fundamental gap in parity decision tree theory and extends the reach of direct-sum principles beyond communication complexity to finer-grained algebraic query models, offering a new template for multi-instance lower-bound analysis in query complexity.
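To make the query model concrete, here is a minimal toy sketch (our illustration, not a construction from the paper) of a deterministic parity decision tree: each internal node queries the parity (XOR) of a chosen subset of the input bits, and a randomized parity decision tree is a distribution over such trees.

```python
import random

class Node:
    """One node of a parity decision tree (toy illustration, not from the paper)."""
    def __init__(self, query=None, zero=None, one=None, label=None):
        self.query = query    # indices whose parity this node queries (None at a leaf)
        self.zero = zero      # subtree followed when the queried parity is 0
        self.one = one        # subtree followed when the queried parity is 1
        self.label = label    # output value at a leaf (None at an internal node)

def evaluate(tree, x):
    """Follow parity queries from the root to a leaf; return (output, #queries)."""
    node, depth = tree, 0
    while node.label is None:
        parity = sum(x[i] for i in node.query) % 2
        node = node.one if parity else node.zero
        depth += 1
    return node.label, depth

# Depth-1 tree computing the XOR of all 4 input bits: one parity query
# suffices, whereas an ordinary decision tree must read every bit.
tree = Node(query={0, 1, 2, 3}, zero=Node(label=0), one=Node(label=1))

x = [random.randint(0, 1) for _ in range(4)]
value, queries = evaluate(tree, x)
assert value == sum(x) % 2
print(f"input={x}, XOR={value}, parity queries used={queries}")
```

The relevant complexity measure is the number of parity queries on the worst-case path (the tree's depth); the direct sum theorem lower-bounds this measure for $k$-fold instances.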
📝 Abstract
Direct sum theorems state that the cost of solving $k$ instances of a problem is at least $\Omega(k)$ times the cost of solving a single instance. We prove the first such results in the randomised parity decision tree model. We show that a direct sum theorem holds whenever (1) the lower bound for parity decision trees is proved using the discrepancy method; or (2) the lower bound is proved relative to a product distribution.
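In symbols, writing $\mathsf{R}^{\oplus}_{\varepsilon}(f)$ for the $\varepsilon$-error randomised parity decision tree complexity of $f$ and $f^{k}$ for $k$ independent copies of $f$, the statement takes the following shape (the notation here is ours for illustration; the paper's exact definitions and error regimes may differ):

```latex
% Hedged sketch of the direct sum statement; the notation
% R^{\oplus} and the precise quantification are illustrative
% assumptions, not quoted from the paper.
\[
  \mathsf{R}^{\oplus}_{\varepsilon}\bigl(f^{k}\bigr)
  \;\ge\; \Omega\!\bigl(k \cdot \mathsf{R}^{\oplus}_{\varepsilon}(f)\bigr),
\]
% valid whenever (1) the lower bound on R^{\oplus}(f) is proved via
% the discrepancy method, or (2) it is proved relative to a product
% distribution over the inputs.
```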