🤖 AI Summary
Existing characterizations of pseudorandomness via pseudoentropy lack a unified framework that simultaneously accommodates both uniform and non-uniform computational models and general entropy measures, including Shannon entropy and min-entropy.
Method: We propose the first generic pseudoentropy characterization framework, built upon weight-restricted calibration, a novel refinement of the classic Complexity-Theoretic Regularity Lemma, achieving an exponential improvement in the dependence on the alphabet size. Our approach integrates multi-granularity analysis with computational indistinguishability to yield a scalable pseudoentropy paradigm.
Contribution/Results: We establish the first rigorous proof that multicalibration (and even its weakened variants) necessarily incurs an exponential complexity dependence on the alphabet size, thereby demonstrating the tightness and optimality of our framework. This resolves a fundamental gap in the theoretical foundations of the hardness-randomness equivalence, providing a unified, entropy-agnostic, and model-flexible characterization of pseudoentropy.
📄 Abstract
Pseudoentropy characterizations provide a quantitatively precise demonstration of the close relationship between computational hardness and computational randomness. We prove a unified pseudoentropy characterization that generalizes and strengthens previous results for both uniform and non-uniform models of computation. Our characterization holds for a general family of entropy notions that encompasses the common notions of Shannon entropy and min-entropy as special cases. Moreover, we show that the characterizations for different entropy notions can be achieved by a single, universal function that simultaneously witnesses computational hardness and computational randomness. A key technical insight of our work is that the notion of weight-restricted calibration from the recent literature on algorithmic fairness, together with standard computational indistinguishability (known as multiaccuracy in the fairness literature), suffices for proving pseudoentropy characterizations for general entropy notions. This demonstrates the power of weight-restricted calibration to enhance the classic Complexity-Theoretic Regularity Lemma (Trevisan, Tulsiani, and Vadhan, 2009) and Leakage Simulation Lemma (Jetchev and Pietrzak, 2014), and allows us to achieve an exponential improvement in the complexity dependence on the alphabet size compared to the pseudoentropy characterizations by Casacuberta, Dwork, and Vadhan (2024), which are based on the much stronger notion of multicalibration. We show that the exponential dependence on the alphabet size is inevitable for multicalibration as well as for the weaker notion of calibrated multiaccuracy.
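For readers unfamiliar with the fairness-literature notions the abstract invokes, the following sketch recalls the standard binary-label definitions of multiaccuracy and multicalibration (as introduced by Hébert-Johnson et al., 2018); the notation here is illustrative and not taken from the paper itself, which works over general alphabets. For a distinguisher class $\mathcal{C}$, a predictor $p\colon \mathcal{X} \to [0,1]$, and outcomes $y \in \{0,1\}$:

```latex
% Multiaccuracy: no distinguisher in C detects bias in p on average.
\text{$p$ is $(\mathcal{C},\varepsilon)$-multiaccurate if }
\bigl|\,\mathbb{E}\bigl[c(x)\,(y - p(x))\bigr]\bigr| \le \varepsilon
\quad \text{for all } c \in \mathcal{C}.

% Multicalibration: the same holds even conditioned on each
% predicted value v, a much stronger, per-level-set requirement.
\text{$p$ is $(\mathcal{C},\varepsilon)$-multicalibrated if }
\bigl|\,\mathbb{E}\bigl[c(x)\,(y - p(x)) \mid p(x) = v\bigr]\bigr| \le \varepsilon
\quad \text{for all } c \in \mathcal{C} \text{ and values } v.
```

Weight-restricted calibration, as the abstract describes it, sits between these extremes: it strengthens plain multiaccuracy enough to yield pseudoentropy characterizations while avoiding the exponential alphabet-size cost that the paper shows is inherent to multicalibration.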