🤖 AI Summary
This work investigates the maximization of mutual information between the output of a Boolean function and a noisy version of its input, transmitted over a binary symmetric channel, with a focus on the Courtade–Kumar conjecture. Combining tools from information theory, Fourier analysis, and entropy expansions, the study generalizes the upper bound of \(1 - H(\alpha)\) on the sum of mutual information between the function's output and the individual noisy input coordinates from balanced to arbitrarily biased Boolean functions, where \(\alpha\) is the crossover probability of the channel. In the high-noise regime, where \(\lambda = (1-2\alpha)^2 \to 0\) (equivalently, \(\alpha \to 1/2\)), the authors derive an optimal \(O(\lambda^2)\) error bound for the asymptotic entropy expansion and obtain a sharp Fourier concentration result for highly informative functions. These advances significantly extend the range of noise parameters for which the Courtade–Kumar conjecture is known to hold.
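For reference, the generalized bound can be stated as follows (a sketch based on our reading of the abstract, where \(X^n\) is a uniform input to the channel, \(Y^n\) its noisy output, and \(H\) the binary entropy function):

$$\sum_{i=1}^{n} I\big(f(X^n);\, Y_i\big) \;\le\; 1 - H(\alpha), \qquad H(\alpha) = -\alpha \log_2 \alpha - (1-\alpha)\log_2(1-\alpha).$$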
📝 Abstract
The Courtade–Kumar conjecture posits that dictatorship functions maximize the mutual information between a Boolean function's output and a noisy version of its input over the Boolean hypercube. We present two advances on this conjecture. First, we resolve an open question posed by Courtade and Kumar, proving that for any Boolean function (regardless of bias), the sum of mutual information between the function's output and the individual noisy input coordinates is bounded by $1-H(\alpha)$, where $\alpha$ is the crossover probability of the binary symmetric channel. This generalizes their earlier result, which was restricted to balanced Boolean functions. Second, we advance the study of the main conjecture in the high-noise regime. We establish an optimal error bound of $O(\lambda^2)$ for the asymptotic entropy expansion, where $\lambda = (1-2\alpha)^2$, improving upon the previous best-known bounds. This refined analysis leads to a sharp, linear Fourier concentration bound for highly informative functions and significantly extends the range of the noise parameter $\lambda$ for which the conjecture is known to hold.
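As an illustrative aside (not taken from the paper), the conjectured extremality of dictatorship functions can be checked numerically for small $n$ by exact enumeration over a BSC. The helper names below (`mutual_information`, `binary_entropy`) are hypothetical and written for this sketch only:

```python
import itertools
import math

def mutual_information(f, n, alpha):
    """Exact I(f(X^n); Y^n) for uniform X^n sent through BSC(alpha).

    Enumerates all 2^n inputs and outputs, so this is feasible only for small n.
    """
    points = list(itertools.product([0, 1], repeat=n))
    joint = {}  # joint distribution of (f(X^n), Y^n)
    for x in points:
        px = 2.0 ** (-n)          # uniform input
        b = f(x)
        for y in points:
            d = sum(xi != yi for xi, yi in zip(x, y))   # Hamming distance
            pyx = (alpha ** d) * ((1 - alpha) ** (n - d))
            joint[(b, y)] = joint.get((b, y), 0.0) + px * pyx
    pb, py = {}, {}               # marginals of f(X^n) and Y^n
    for (b, y), p in joint.items():
        pb[b] = pb.get(b, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (pb[b] * py[y]))
               for (b, y), p in joint.items() if p > 0)

def binary_entropy(a):
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a)

alpha, n = 0.4, 3                 # high-noise example: lambda = (1-2*alpha)^2 = 0.04
dictator = lambda x: x[0]
majority = lambda x: int(sum(x) > n / 2)
print("1 - H(alpha):", 1 - binary_entropy(alpha))               # conjectured maximum
print("dictator:    ", mutual_information(dictator, n, alpha))  # attains 1 - H(alpha)
print("majority:    ", mutual_information(majority, n, alpha))  # strictly smaller here
```

For the dictatorship $f(x) = x_1$, the computation returns exactly $1 - H(\alpha)$, since $Y_i$ is independent of $X_1$ for $i \neq 1$; at this noise level, majority falls strictly below that value, consistent with the conjecture.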