🤖 AI Summary
This study addresses the stability of Ghurye–Olkin (GO) Gaussian vector characterizations: when the GO independence condition is only approximately satisfied, do the component sums within equivalence classes and the vector projections remain approximately Gaussian? The paper establishes a unified stability analysis of GO characterizations in both the characteristic function (c.f.) and distribution function (d.f.) domains. Leveraging Kac–Bernstein and Cramér-type stability techniques, an equivalence-class decomposition, and distributional approximation methods, it proves that near-GO independence implies near-Gaussianity of every vector projection (in the d.f. domain) and of the class-wise sums (in the c.f. domain). It further derives quantitative stability bounds for differential entropies of Gaussian vectors and identifiability guarantees for blind source separation of non-Gaussian sources. The core contribution is the dual-domain analytical framework and its extensions to information theory and blind signal processing.
📝 Abstract
The stability of the Ghurye–Olkin (GO) characterization of Gaussian vectors is analyzed using a partition of the vectors into equivalence classes defined by their matrix factors. The sum of the vectors in each class is near-Gaussian in the characteristic function (c.f.) domain if the GO independence condition is approximately met in the c.f. domain. All vectors have the property that any vector projection is near-Gaussian in the distribution function (d.f.) domain. The proofs of these c.f. and d.f. stabilities use tools that establish the stabilities of theorems by Kac–Bernstein and Cramér, respectively. The results are used to prove stability theorems for differential entropies of Gaussian vectors and blind source separation of non-Gaussian sources.
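For reference, the two classical characterizations whose stability versions drive the proofs can be stated as follows (standard formulations, not quoted from the paper itself):

```latex
% Kac--Bernstein theorem: independence of the sum and difference of two
% independent random variables forces both to be Gaussian (with equal variance).
\[
  X \perp Y \ \text{and}\ (X+Y) \perp (X-Y)
  \;\Longrightarrow\;
  X \sim \mathcal{N}(\mu_1,\sigma^2),\quad Y \sim \mathcal{N}(\mu_2,\sigma^2).
\]

% Cram\'er's decomposition theorem: if a sum of independent random variables
% is Gaussian, then each summand must itself be Gaussian.
\[
  X \perp Y \ \text{and}\ X+Y \sim \mathcal{N}(\mu,\sigma^2)
  \;\Longrightarrow\;
  X,\ Y \ \text{are each Gaussian (possibly degenerate).}
\]
```

The paper's stability results replace the exact hypotheses above with approximate versions (near-independence in the c.f. domain, near-Gaussianity of the sum in the d.f. domain) and conclude near-Gaussianity with quantitative bounds.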