🤖 AI Summary
This work resolves an open problem concerning the sign-rank of the $k$-Hamming Distance matrix: whether it grows with the input dimension $n$. Leveraging combinatorial matrix analysis, structured dimensionality reduction, kernel techniques, and reductions from communication complexity, the authors prove an upper bound of $2^{O(k)}$ on the sign-rank. Crucially, this bound depends only on $k$ and is independent of $n$, establishing for the first time that the sign-rank stays bounded as $n$ grows. This refutes a conjecture, advanced at RANDOM 2022 and STOC 2023, that the sign-rank must scale with $n$, and overturns the expectation that such high-dimensional distance matrices necessarily become more complex as the dimension increases. The conjecture would have qualitatively separated margin from sign-rank, or equivalently bounded-error from unbounded-error randomized communication; instead, the technique yields constant sign-rank upper bounds for all matrices that reduce to $k$-Hamming Distance, and even for large-margin matrices recently shown not to reduce to it.
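For reference, the objects involved can be stated as follows. The definition of sign-rank is standard; the exact entry convention chosen for the $k$-Hamming Distance matrix (exact distance $k$ rather than a threshold) is an assumption here and may differ slightly from the paper's.

$$
\operatorname{rank}_{\pm}(M) \;=\; \min\bigl\{\operatorname{rank}(A) \;:\; A \in \mathbb{R}^{N \times N},\ \operatorname{sign}(A_{x,y}) = M_{x,y}\ \text{for all } x,y \bigr\},
\qquad M \in \{-1,+1\}^{N \times N},
$$

$$
\mathrm{HD}^{n}_{k}(x,y) \;=\;
\begin{cases}
+1 & \text{if } |x \oplus y| = k,\\
-1 & \text{otherwise,}
\end{cases}
\qquad x, y \in \{0,1\}^{n},
$$

and the main result then reads $\operatorname{rank}_{\pm}\bigl(\mathrm{HD}^{n}_{k}\bigr) \le 2^{O(k)}$ for every $n$.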
📝 Abstract
We prove that the sign-rank of the $k$-Hamming Distance matrix on $n$ bits is $2^{O(k)}$, independent of the number of bits $n$. This strongly refutes the conjecture of Hatami, Hatami, Pires, Tao, and Zhao (RANDOM 2022), and Hatami, Hosseini, and Meng (STOC 2023), repeated in several other papers, that the sign-rank should depend on $n$. This conjecture would have qualitatively separated margin from sign-rank (or, equivalently, bounded-error from unbounded-error randomized communication). In fact, our technique gives constant sign-rank upper bounds for all matrices which reduce to $k$-Hamming Distance, as well as large-margin matrices recently shown to be irreducible to $k$-Hamming Distance.
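As a small illustrative sketch (not from the paper), the following Python snippet builds the $k$-Hamming Distance sign matrix for a small $n$ under the exact-distance convention above and prints its ordinary real rank. The sign-rank, which the theorem bounds by $2^{O(k)}$ independently of $n$, can be far smaller than the real rank and is not computed here.

```python
import numpy as np
from itertools import product

def k_hamming_sign_matrix(n, k):
    """Build the 2^n x 2^n sign matrix with entry +1 when the Hamming
    distance between x and y equals k, and -1 otherwise.
    (One common convention; the paper's exact variant may differ.)"""
    points = list(product([0, 1], repeat=n))
    size = len(points)
    M = np.empty((size, size), dtype=int)
    for i, x in enumerate(points):
        for j, y in enumerate(points):
            dist = sum(a != b for a, b in zip(x, y))
            M[i, j] = 1 if dist == k else -1
    return M

if __name__ == "__main__":
    M = k_hamming_sign_matrix(n=6, k=2)
    # The ordinary real rank of this sign matrix grows with n; the sign-rank,
    # i.e. the least rank of any real matrix with the same sign pattern,
    # is the quantity bounded by 2^{O(k)} in the result above.
    print("matrix size:", M.shape, "real rank:", np.linalg.matrix_rank(M))
```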