On the Minimax Regret of Sequential Probability Assignment via Square-Root Entropy

📅 2025-03-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper investigates the minimax regret for sequential probability assignment under logarithmic loss, in settings both with and without side information. To characterize problem complexity, the authors introduce "sequential square-root entropy," a geometric complexity measure closely related to Hellinger distance. In the no-side-information setting, this measure yields an upper bound on the Shtarkov sum. In the side-information setting, the analysis combines covering numbers and scale-sensitive dimensions to establish upper and lower regret bounds that match, up to logarithmic factors, for classes in the Donsker regime. Collectively, the work establishes a quantitative link between sequential prediction regret and the geometric structure of the underlying function class.

📝 Abstract
We study the problem of sequential probability assignment under logarithmic loss, both with and without side information. Our objective is to analyze the minimax regret -- a notion extensively studied in the literature -- in terms of geometric quantities, such as covering numbers and scale-sensitive dimensions. We show that the minimax regret for the case of no side information (equivalently, the Shtarkov sum) can be upper bounded in terms of sequential square-root entropy, a notion closely related to Hellinger distance. For the problem of sequential probability assignment with side information, we develop both upper and lower bounds based on the aforementioned entropy. The lower bound matches the upper bound, up to log factors, for classes in the Donsker regime (according to our definition of entropy).
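For readers new to the terminology, the minimax regret and the Shtarkov sum referenced in the abstract have standard definitions, sketched below in our own notation (which may differ from the paper's):

```latex
% Minimax regret for sequential probability assignment over an
% alphabet \mathcal{X} with a comparator class \mathcal{F} of joint
% distributions, under logarithmic loss (standard formulation).
\[
  \mathcal{R}_n(\mathcal{F})
  = \inf_{q}\; \sup_{x^n \in \mathcal{X}^n}
    \left[ \log \frac{1}{q(x^n)}
         \;-\; \inf_{f \in \mathcal{F}} \log \frac{1}{f(x^n)} \right],
\]
% where the infimum is over all probability assignments q on X^n.
% With no side information, this equals the log of the Shtarkov sum:
\[
  \mathcal{R}_n(\mathcal{F})
  = \log \sum_{x^n \in \mathcal{X}^n}\; \sup_{f \in \mathcal{F}} f(x^n).
\]
```

The second identity (due to Shtarkov) is what the abstract means by the equivalence between the no-side-information minimax regret and the Shtarkov sum; the paper's contribution is bounding this quantity via sequential square-root entropy.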
Problem

Research questions and friction points this paper is trying to address.

Analyze minimax regret in sequential probability assignment
Bound regret using geometric quantities like entropy
Develop bounds for cases with and without side information
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sequential square-root entropy as a new complexity measure
Regret analysis via geometric quantities (covering numbers, scale-sensitive dimensions)
Matching upper and lower bounds, up to log factors, in the Donsker regime