Repulsive Mixture Model with Projection Determinantal Point Process

📅 2025-10-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
In Bayesian mixture models, independently specified priors on cluster centers often induce component overlap, undermining interpretability; existing repulsive priors face both computational and theoretical obstacles. This paper proposes a repulsive mixture model based on the projection determinantal point process (DPP), the first fully tractable and exactly samplable repulsive prior. The authors derive closed-form posterior and predictive distributions, design efficient conditional Gibbs and marginal sampling algorithms, and establish frequentist consistency guarantees. Experiments demonstrate that the model significantly outperforms existing methods under model misspecification and identifies neurocognitively meaningful, interpretable subgroups in event-related potential data.

📝 Abstract
In many scientific domains, clustering aims to reveal interpretable latent structure that reflects relevant subpopulations or processes. Widely used Bayesian mixture models for model-based clustering often produce overlapping or redundant components because priors on cluster locations are specified independently, hindering interpretability. To mitigate this, repulsive priors have been proposed to encourage well-separated components, yet existing approaches face both computational and theoretical challenges. We introduce a fully tractable Bayesian repulsive mixture model by assigning a projection Determinantal Point Process (DPP) prior to the component locations. Projection DPPs induce strong repulsion and allow exact sampling, enabling parsimonious and interpretable posterior clustering. Leveraging their analytical tractability, we derive closed-form posterior and predictive distributions. These results, in turn, enable two efficient inference algorithms: a conditional Gibbs sampler and the first fully implementable marginal sampler for DPP-based mixtures. We also provide strong frequentist guarantees, including posterior consistency for density estimation, elimination of redundant components, and contraction of the mixing measure. Simulation studies confirm superior mixing and clustering performance compared to alternatives in misspecified settings. Finally, we demonstrate the utility of our method on event-related potential functional data, where it uncovers interpretable neuro-cognitive subgroups. Our results support projection DPP mixtures as a theoretically sound and practically effective solution for Bayesian clustering.
Problem

Research questions and friction points this paper is trying to address.

Develops tractable Bayesian repulsive mixture model
Addresses overlapping components that hinder clustering interpretability
Enables exact sampling and efficient inference algorithms
Innovation

Methods, ideas, or system contributions that make the work stand out.

Projection DPP prior induces strong repulsion
Enables exact sampling and closed-form distributions
Provides efficient Gibbs and marginal samplers
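A projection DPP with kernel K = V Vᵀ, where V has orthonormal columns, always yields exactly k points and admits exact sequential sampling (the standard HKPV-style algorithm for projection kernels). The sketch below illustrates this exact-sampling property on a finite candidate set; it is a generic illustration under assumed inputs (a random orthonormal V over N grid points), not the paper's own implementation, and the function name and setup are hypothetical.

```python
import numpy as np

def sample_projection_dpp(V, rng=None):
    """Draw one exact sample from the projection DPP with kernel K = V @ V.T,
    where V is (N, k) with orthonormal columns. Returns k distinct indices."""
    rng = np.random.default_rng(rng)
    V = np.array(V, dtype=float, copy=True)
    N, k = V.shape
    chosen = []
    for _ in range(k):
        # marginal inclusion weights are the squared row norms of V
        p = np.einsum("ij,ij->i", V, V)
        p[chosen] = 0.0              # numerical safety: selected rows are ~0
        p = np.clip(p, 0.0, None)
        i = int(rng.choice(N, p=p / p.sum()))
        chosen.append(i)
        if V.shape[1] == 1:
            break
        # condition on selecting i: drop one column, zero out row i,
        # then re-orthonormalize the remaining columns
        j = int(np.argmax(np.abs(V[i])))
        vj = V[:, j].copy()
        V = np.delete(V, j, axis=1)
        V -= np.outer(vj, V[i] / vj[i])
        V, _ = np.linalg.qr(V)
    return chosen

# Hypothetical usage: k repulsive cluster locations among N candidates.
rng = np.random.default_rng(0)
N, k = 50, 4
V, _ = np.linalg.qr(rng.standard_normal((N, k)))  # orthonormal columns
idx = sample_projection_dpp(V, rng)
```

Because the kernel is a projection, the sample size is deterministic (always k) and no rejection or spectral truncation is needed, which is what makes this prior exactly samplable in a mixture-model context.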