Model Selection and Parameter Estimation of Multi-dimensional Gaussian Mixture Model

📅 2026-03-20
🤖 AI Summary
This work addresses model order selection and parameter estimation for multidimensional Gaussian mixture models (GMMs). It first establishes an information-theoretic lower bound on the sample complexity required for consistent model selection, then proposes a minimax-optimal algorithm that determines the number of components from the spectral gap of an empirical covariance matrix built from random Fourier measurements. Building on this, the method combines PCA-based dimensionality reduction, score-driven initialization, and gradient-based optimization to achieve efficient and accurate parameter estimation. The proposed approach attains the optimal sample complexity and an $\mathcal{O}_p(n^{-1/2})$ parametric convergence rate for estimating the component means. Numerical experiments demonstrate clear advantages over conventional EM algorithms in both estimation accuracy and computational efficiency.
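The model-selection step described above can be illustrated with a minimal sketch: project the data through random Fourier measurement vectors, form an empirical (second-moment) covariance of the resulting complex features, and read the number of components off the largest spectral gap. This is an assumed, simplified construction for illustration, not the authors' exact estimator; the function name, the choice of `m`, and the relative-gap rule are all hypothetical.

```python
import numpy as np

def estimate_num_components(X, m=20, rng=None):
    """Hypothetical sketch of spectral-gap model order selection for a GMM.

    Builds complex random Fourier features, forms their empirical
    second-moment matrix, and returns the index of the largest relative
    eigenvalue gap as the estimated number of components.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    T = rng.standard_normal((m, d))        # random Fourier measurement vectors
    Z = np.exp(1j * X @ T.T)               # n x m complex Fourier features
    C = (Z.conj().T @ Z) / n               # m x m empirical moment matrix (Hermitian)
    eigs = np.sort(np.abs(np.linalg.eigvalsh(C)))[::-1]
    gaps = eigs[:-1] / np.maximum(eigs[1:], 1e-12)
    return int(np.argmax(gaps)) + 1        # largest relative gap -> order
```

For well-separated components the top eigenvalues carry one direction per mixture component, so the gap between the $k$-th and $(k+1)$-th eigenvalue dominates; the overall cost is linear in the sample size, consistent with the $\mathcal{O}(k^2 n)$ complexity claimed in the abstract.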

📝 Abstract
In this paper, we study the problem of learning multi-dimensional Gaussian Mixture Models (GMMs), with a specific focus on model order selection and efficient mixing distribution estimation. We first establish an information-theoretic lower bound on the critical sample complexity required for reliable model selection. More specifically, we show that distinguishing a $k$-component mixture from a simpler model necessitates a sample size scaling of $\Omega(\Delta^{-(4k-4)})$. We then propose a thresholding-based estimation algorithm that evaluates the spectral gap of an empirical covariance matrix constructed from random Fourier measurement vectors. This parameter-free estimator operates with an efficient time complexity of $\mathcal{O}(k^2 n)$, scaling linearly with the sample size. We demonstrate that the sample complexity of our method matches the established lower bound, confirming its minimax optimality with respect to the component separation distance $\Delta$. Conditioned on the estimated model order, we subsequently introduce a gradient-based minimization method for parameter estimation. To effectively navigate the non-convex objective landscape, we employ a data-driven, score-based initialization strategy that guarantees rapid convergence. We prove that this method achieves the optimal parametric convergence rate of $\mathcal{O}_p(n^{-1/2})$ for estimating the component means. To enhance the algorithm's efficiency in high-dimensional regimes where the ambient dimension exceeds the number of mixture components (i.e., $d > k$), we integrate principal component analysis (PCA) for dimension reduction. Numerical experiments demonstrate that our Fourier-based algorithmic framework outperforms conventional Expectation-Maximization (EM) methods in both estimation accuracy and computational time.
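The estimation stage in the abstract, PCA reduction when $d > k$, a data-driven initialization, and gradient-based optimization of the component means, can be sketched as below. This is a hedged illustration under simplifying assumptions (equal weights, identity covariances); the paper's score-based initialization is replaced by a farthest-point heuristic, and the function name and hyperparameters are hypothetical.

```python
import numpy as np

def fit_gmm_means(X, k, steps=200, lr=0.5, rng=None):
    """Sketch: PCA reduction + gradient ascent on the log-likelihood of an
    equal-weight, identity-covariance GMM. Not the authors' exact method."""
    rng = np.random.default_rng(rng)
    mean = X.mean(axis=0)
    Xc = X - mean
    # PCA: project onto the top-k principal directions when d > k
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:k].T if X.shape[1] > k else np.eye(X.shape[1])
    Y = Xc @ P
    # farthest-point initialization (stand-in for the score-based init)
    mu = [Y[rng.integers(len(Y))]]
    for _ in range(k - 1):
        dist = np.min([np.sum((Y - m) ** 2, axis=1) for m in mu], axis=0)
        mu.append(Y[np.argmax(dist)])
    mu = np.array(mu)
    for _ in range(steps):
        # responsibilities under equal weights and identity covariances
        d2 = ((Y[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        R = np.exp(-(d2 - d2.min(1, keepdims=True)) / 2)
        R /= R.sum(1, keepdims=True)
        grad = R.T @ Y - R.sum(0)[:, None] * mu   # d(log-lik)/d(mu_l)
        mu += lr * grad / len(Y)
    return mu @ P.T + mean                        # map back to ambient space
```

The gradient $\sum_i r_{il}(y_i - \mu_l)$ is the standard log-likelihood gradient for the means under these assumptions; with a good initialization the iteration contracts toward the same fixed point as an EM mean update.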
Problem

Research questions and friction points this paper is trying to address.

Gaussian Mixture Model
Model Selection
Parameter Estimation
Sample Complexity
High-dimensional Statistics
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gaussian Mixture Model
Model Selection
Sample Complexity
Spectral Gap
Fourier-based Estimation