Generalized and Unified Equivalences between Hardness and Pseudoentropy

šŸ“… 2025-07-08
šŸ“ˆ Citations: 0
✨ Influential: 0
šŸ¤– AI Summary
Existing pseudoentropy characterizations of computational hardness lack a unified framework that simultaneously covers uniform and non-uniform models of computation and general entropy measures, including Shannon entropy and min-entropy. Method: The paper proves a unified pseudoentropy characterization built on weight-restricted calibration, a notion from the recent algorithmic fairness literature, combined with standard computational indistinguishability (known as multiaccuracy in that literature). This strengthens the classic Complexity-Theoretic Regularity Lemma and the Leakage Simulation Lemma, yielding an exponential improvement in the complexity dependence on the alphabet size over prior multicalibration-based characterizations. Contribution/Results: The characterizations for different entropy notions are achieved simultaneously by a single universal function that witnesses both computational hardness and computational randomness, and the exponential dependence on alphabet size is shown to be inevitable for multicalibration as well as for the weaker notion of calibrated multiaccuracy. This provides a unified, entropy-general, and model-flexible characterization of the equivalence between computational hardness and pseudoentropy.

šŸ“ Abstract
Pseudoentropy characterizations provide a quantitatively precise demonstration of the close relationship between computational hardness and computational randomness. We prove a unified pseudoentropy characterization that generalizes and strengthens previous results for both uniform and non-uniform models of computation. Our characterization holds for a general family of entropy notions that encompasses the common notions of Shannon entropy and min-entropy as special cases. Moreover, we show that the characterizations for different entropy notions can be achieved simultaneously by a single universal function that witnesses both computational hardness and computational randomness. A key technical insight of our work is that the notion of weight-restricted calibration from the recent literature on algorithmic fairness, along with standard computational indistinguishability (known as multiaccuracy in the fairness literature), suffices for proving pseudoentropy characterizations for general entropy notions. This demonstrates the power of weight-restricted calibration to enhance the classic Complexity-Theoretic Regularity Lemma (Trevisan, Tulsiani, and Vadhan, 2009) and Leakage Simulation Lemma (Jetchev and Pietrzak, 2014), and it allows us to achieve an exponential improvement in the complexity dependence on the alphabet size compared to the pseudoentropy characterizations by Casacuberta, Dwork, and Vadhan (2024), which are based on the much stronger notion of multicalibration. We show that the exponential dependence on the alphabet size is inevitable for multicalibration as well as for the weaker notion of calibrated multiaccuracy.
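As a quick reminder of the two special cases the abstract names: for a distribution with probabilities $p_1,\dots,p_k$, Shannon entropy is $H(X) = -\sum_i p_i \log_2 p_i$ and min-entropy is $H_\infty(X) = -\log_2 \max_i p_i$, so $H_\infty(X) \le H(X)$ always holds. A minimal illustrative computation (the example distribution is mine, not from the paper):

```python
import math

def shannon_entropy(p):
    # H(X) = -sum_i p_i * log2(p_i), in bits; terms with p_i = 0 contribute 0.
    return -sum(q * math.log2(q) for q in p if q > 0)

def min_entropy(p):
    # H_inf(X) = -log2(max_i p_i), in bits.
    return -math.log2(max(p))

dist = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(dist))  # 1.75
print(min_entropy(dist))      # 1.0
```

On this distribution the two notions already differ (1.75 vs. 1.0 bits), which is why a characterization covering a general family of entropy notions is strictly more informative than one fixed to either measure.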
Problem

Research questions and friction points this paper is trying to address.

Unify pseudoentropy characterizations for computational hardness and randomness
Generalize entropy notions including Shannon and min entropy
Improve complexity dependency on alphabet size exponentially
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unified pseudoentropy characterization for computational models
Single universal function for hardness and randomness
Weight-restricted calibration enhances complexity-theoretic lemmas
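The third bullet refers to notions from the fairness literature. In the standard formulation there, a predictor $p$ is $\alpha$-multiaccurate with respect to a class $C$ of test functions if $|\mathbb{E}[c(x)\,(y - p(x))]| \le \alpha$ for every $c \in C$; calibration conditions additionally constrain the residuals on each level set of $p$. A minimal sketch of checking the multiaccuracy condition on finite data (the toy data, function names, and test class are mine, not the paper's):

```python
def multiaccuracy_violation(xs, ys, predict, tests):
    # Largest correlation |E[c(x) * (y - p(x))]| between a test function
    # and the prediction residuals, over the given test class.
    n = len(xs)
    return max(
        abs(sum(c(x) * (y - predict(x)) for x, y in zip(xs, ys)) / n)
        for c in tests
    )

# Toy data: the label is 1 iff x is even; the predictor is a constant 0.5.
xs = list(range(8))
ys = [1 if x % 2 == 0 else 0 for x in xs]
predict = lambda x: 0.5
tests = [
    lambda x: 1.0,                          # detects overall bias: passes
    lambda x: 1.0 if x % 2 == 0 else -1.0,  # parity test: fails
]

print(multiaccuracy_violation(xs, ys, predict, tests))  # 0.5
```

The constant predictor is unbiased on average but correlates with the parity test, so its violation is 0.5 rather than 0; a richer test class forces the predictor to absorb more structure, which is the sense in which such conditions power regularity-style lemmas.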