🤖 AI Summary
This paper investigates the statistical distance between high-dimensional permutation mixtures and their i.i.d. counterparts, aiming to derive tight non-asymptotic, dimension-free bounds. Methodologically, it introduces a geometric framework integrating spectral analysis, information geometry, and mean-field theory: it characterizes an intrinsic link between the spectrum of the channel overlap matrix and the information-geometric structure of the channel, and carries out a refined mean-field analysis of permutation-invariant decision rules, establishing strong non-asymptotic equivalence between two canonical notions of compound regret. The results uncover dimension-driven phase transitions in Gaussian and Poisson models, closing gaps in existing theory on non-asymptotic precision and model generality. Notably, this work provides a unified, tight, and computationally tractable performance characterization for compound decision problems.
📝 Abstract
We develop sharp bounds on the statistical distance between high-dimensional permutation mixtures and their i.i.d. counterparts. Our approach establishes a new geometric link between the spectrum of a complex channel overlap matrix and the information geometry of the channel, yielding tight dimension-independent bounds that close gaps left by previous work. Within this geometric framework, we also derive dimension-dependent bounds that uncover phase transitions in dimensionality for Gaussian and Poisson families. Applied to compound decision problems, this refined control of permutation mixtures enables sharper mean-field analyses of permutation-invariant decision rules, yielding strong non-asymptotic equivalence results between two notions of compound regret in Gaussian and Poisson models.
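To make the comparison concrete, the two distributions being contrasted can be written as follows. This is a standard formulation of the permutation-mixture setup; the notation here is illustrative and not taken from the paper itself. Given a channel $\{P_\theta\}$ and parameters $\theta_1,\dots,\theta_n$, one compares

```latex
% Permutation mixture: average over all orderings of the parameters
P_{\mathrm{perm}} \;=\; \frac{1}{n!}\sum_{\pi \in S_n}\;\bigotimes_{i=1}^{n} P_{\theta_{\pi(i)}},
% i.i.d. counterpart: n independent draws from the mixed channel
\qquad
P_{\mathrm{iid}} \;=\; \Bigl(\frac{1}{n}\sum_{i=1}^{n} P_{\theta_i}\Bigr)^{\!\otimes n}.
```

The bounds in the paper control a statistical distance between these two objects (e.g., total variation or a $\chi^2$-type divergence) uniformly in the dimension of the observation space; "dimension-independent" means the bound does not degrade as that dimension grows.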