The Conditional Regret-Capacity Theorem for Batch Universal Prediction

📅 2025-08-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates the fundamental lower bound on the *minimum average regret* in batch universal prediction, i.e., the performance limit of a predictor granted access to a batch of training data. Methodologically, it combines information-theoretic tools, conditional regret analysis, and a batch-learning model to derive rigorous lower bounds. The key contribution is the *Conditional Regret-Capacity Theorem*, which characterizes the batch regret lower bound in terms of the conditional Rényi divergence and the conditional Sibson mutual information, unifying classical regret analysis with the frameworks of batch learning and generalized Rényi information measures. The bound is instantiated on the class of binary memoryless sources, yielding an information-theoretically interpretable performance lower bound for batch universal prediction and strengthening the foundational theory of universal prediction in batch settings.

📝 Abstract
We derive a conditional version of the classical regret-capacity theorem. This result can be used in universal prediction to find lower bounds on the minimal batch regret, which is a recently introduced generalization of the average regret, when batches of training data are available to the predictor. As an example, we apply this result to the class of binary memoryless sources. Finally, we generalize the theorem to Rényi information measures, revealing a deep connection between the conditional Rényi divergence and the conditional Sibson's mutual information.
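For background, the classical (unconditional) regret-capacity theorem that the abstract extends is commonly stated as a minimax identity; the sketch below uses standard notation from the literature, not necessarily this paper's, and the final remark about conditioning on the training batch is an inference from the abstract rather than a quoted result:

```latex
% Classical regret-capacity (redundancy-capacity) identity, background only:
% the minimax average regret over a parametric class {p_theta} equals the
% capacity of the "channel" from the parameter Theta to the observation X^n.
\[
  \min_{q}\ \max_{\theta}\ D\!\left(p_{\theta} \,\middle\|\, q\right)
  \;=\; \max_{\pi}\ I_{\pi}\!\left(\Theta; X^{n}\right),
\]
% where q ranges over distributions on length-n sequences and \pi over
% priors on \Theta. Per the abstract, the conditional version conditions
% both sides on the available batch of training data.
```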
Problem

Research questions and friction points this paper is trying to address.

Extends regret-capacity theorem to conditional cases
Finds lower bounds for batch regret in prediction
Links Rényi divergence to Sibson's mutual information
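The Rényi quantities named in the last point have standard definitions in the literature; the following is a sketch of those standard definitions, not this paper's specific notation:

```latex
% Renyi divergence of order alpha (standard definition):
\[
  D_{\alpha}\!\left(P \,\middle\|\, Q\right)
  \;=\; \frac{1}{\alpha - 1}
  \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},
  \qquad \alpha \in (0,1) \cup (1,\infty).
\]
% Sibson's mutual information of order alpha arises by minimizing the
% Renyi divergence over product approximations of the joint distribution:
\[
  I_{\alpha}(X; Y)
  \;=\; \min_{Q_Y}\, D_{\alpha}\!\left(P_{XY} \,\middle\|\, P_X \times Q_Y\right).
\]
```

The minimization form of Sibson's mutual information mirrors the capacity side of the regret-capacity identity, which is presumably the connection the paper formalizes.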
Innovation

Methods, ideas, or system contributions that make the work stand out.

Conditional regret-capacity theorem derivation
Lower bounds on minimal batch regret
Generalization to Rényi information measures