Stability of the Ghurye-Olkin Characterization of Vector Gaussian Distributions

📅 2024-05-02
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the stability of Ghurye–Olkin (GO) Gaussian vector characterizations: under approximate satisfaction of the GO independence condition, do component sums within equivalence classes and overall projections remain approximately Gaussian? The paper establishes, for the first time, a unified stability analysis of GO characterizations in both the characteristic function (c.f.) and distribution function (d.f.) domains. Leveraging Kac–Bernstein and Cramér-type techniques, equivalence-class decomposition, and distributional approximation methods, it rigorously proves that near-GO independence implies both isotropic near-Gaussianity and near-Gaussianity of class-wise sums. Furthermore, it derives quantitative stability bounds for differential entropy and provides identifiability guarantees for blind source separation from non-Gaussian sources. The core innovation lies in the dual-domain synergistic analytical framework and its novel extensions to information theory and blind signal processing.
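For context, the classical GO characterization (a vector analogue of the Darmois–Skitovich theorem) can be stated as follows; this is the standard textbook form, and the paper's precise hypotheses, which involve equivalence classes of matrix factors, refine it:

```latex
\begin{theorem}[Ghurye--Olkin, classical form]
Let $X_1,\dots,X_n$ be independent random vectors in $\mathbb{R}^d$ and let
$A_1,\dots,A_n$ be nonsingular $d \times d$ matrices. If the linear forms
\[
  L_1 = \sum_{j=1}^{n} X_j
  \qquad\text{and}\qquad
  L_2 = \sum_{j=1}^{n} A_j X_j
\]
are independent, then each $X_j$ is a Gaussian vector.
\end{theorem}
```

The stability question studied here asks what survives when the independence of $L_1$ and $L_2$ holds only approximately.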

📝 Abstract
The stability of the Ghurye-Olkin (GO) characterization of Gaussian vectors is analyzed using a partition of the vectors into equivalence classes defined by their matrix factors. The sum of the vectors in each class is near-Gaussian in the characteristic function (c.f.) domain if the GO independence condition is approximately met in the c.f. domain. All vectors have the property that any vector projection is near-Gaussian in the distribution function (d.f.) domain. The proofs of these c.f. and d.f. stabilities use tools that establish the stabilities of theorems by Kac-Bernstein and Cramér, respectively. The results are used to prove stability theorems for differential entropies of Gaussian vectors and blind source separation of non-Gaussian sources.
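For reference, the two classical theorems whose stability versions supply the proof tools mentioned in the abstract are, in their standard scalar forms:

```latex
\begin{theorem}[Kac--Bernstein]
If $X$ and $Y$ are independent random variables and $X+Y$ and $X-Y$ are
independent, then $X$ and $Y$ are Gaussian.
\end{theorem}

\begin{theorem}[Cram\'er]
If $X$ and $Y$ are independent random variables and $X+Y$ is Gaussian,
then $X$ and $Y$ are each Gaussian.
\end{theorem}
```

Stability versions of these results replace exact independence (or exact Gaussianity of the sum) with an approximate condition and conclude that the components are near-Gaussian, which matches the c.f.- and d.f.-domain arguments described above.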
Problem

Research questions and friction points this paper is trying to address.

Analyzing stability of Gaussian vector characterization via equivalence classes
Establishing near-Gaussian properties under approximate independence conditions
Proving stability theorems for differential entropy and blind source separation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Partition vectors into equivalence classes using matrix factors
Prove near-Gaussianity of class-wise sums when the GO independence condition is approximately met
Establish stability theorems using Kac-Bernstein and Cramér tools
Mahdi Mahvari
School of Computation, Information, and Technology, Technical University of Munich, Germany
Gerhard Kramer
Technical University of Munich
Information Theory · Communication Theory · Optical Communications