Bayesian Mixture Models with Repulsive and Attractive Atoms

📅 2023-02-17
📈 Citations: 1
Influential: 0
🤖 AI Summary
Existing Bayesian mixture models typically assume independence among the atoms of the random probability measure, neglecting inter-atomic interactions, such as repulsion or attraction, that are useful for obtaining well-separated and interpretable clusters. Method: The paper develops a unified framework for the construction and Bayesian analysis of random measures with interacting atoms, accommodating both repulsive and attractive behaviours and dropping the classical independence assumption. It derives closed-form expressions for the posterior, marginal, and predictive distributions of interacting-atom random measures, without any assumption on the finite point process governing the atoms, using analytical tools from Palm calculus. The treatment is specialised to Poisson, Gibbs, and determinantal point processes, as well as to shot-noise Cox processes. Contribution/Results: These closed-form quantities support prior elicitation and new posterior simulation algorithms for hierarchical mixture models, and the resulting priors encourage well-separated, interpretable clusters. Experiments on simulated and real datasets illustrate the performance of the different modelling strategies.
📝 Abstract
The study of almost surely discrete random probability measures is an active line of research in Bayesian nonparametrics. The idea of assuming interaction across the atoms of the random probability measure has recently spurred significant interest in the context of Bayesian mixture models. This allows the definition of priors that encourage well-separated and interpretable clusters. In this work, we provide a unified framework for the construction and the Bayesian analysis of random probability measures with interacting atoms, encompassing both repulsive and attractive behaviours. Specifically, we derive closed-form expressions for the posterior distribution, the marginal and predictive distributions, which were not previously available except for the case of measures with i.i.d. atoms. We show how these quantities are fundamental both for prior elicitation and to develop new posterior simulation algorithms for hierarchical mixture models. Our results are obtained without any assumption on the finite point process that governs the atoms of the random measure. Their proofs rely on analytical tools borrowed from the Palm calculus theory, which might be of independent interest. We specialise our treatment to the classes of Poisson, Gibbs, and determinantal point processes, as well as to the case of shot-noise Cox processes. Finally, we illustrate the performance of different modelling strategies on simulated and real datasets.
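The determinantal point processes mentioned in the abstract are a standard way to induce repulsion among atoms. As an illustrative sketch only (not the paper's algorithm), the classical spectral method for drawing an exact sample from a finite L-ensemble DPP fits in a few lines of NumPy; the function name `sample_dpp` and all implementation details are our own:

```python
import numpy as np

def sample_dpp(L, rng=None):
    """Draw one exact sample from a finite DPP with L-ensemble kernel L.

    Spectral algorithm (Hough et al.): keep eigenvector i with probability
    lambda_i / (1 + lambda_i), then pick items one at a time from the span
    of the kept eigenvectors, projecting the basis down after each pick.
    """
    rng = np.random.default_rng(rng)
    eigvals, eigvecs = np.linalg.eigh(L)
    # Phase 1: Bernoulli selection of eigenvectors.
    keep = rng.random(len(eigvals)) < eigvals / (1.0 + eigvals)
    V = eigvecs[:, keep]
    sample = []
    while V.shape[1] > 0:
        # Item i is chosen with probability proportional to its squared
        # row norm in the current basis V.
        probs = np.clip(np.sum(V * V, axis=1), 0.0, None)
        probs /= probs.sum()
        i = int(rng.choice(len(probs), p=probs))
        sample.append(i)
        # Eliminate one basis column with nonzero weight on item i, then
        # re-orthonormalize so no later pick can select i again.
        j = int(np.argmax(np.abs(V[i, :])))
        V = V - np.outer(V[:, j] / V[i, j], V[i, :])
        V = np.delete(V, j, axis=1)
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)
    return sample
```

Because the inclusion probabilities are determinants of kernel submatrices, items that are similar under `L` rarely co-occur in a sample, which is exactly the repulsive behaviour exploited by DPP-based mixture priors.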
Problem

Research questions and friction points this paper is trying to address.

Develop Bayesian mixture models with interacting atoms
Derive posterior and predictive distributions analytically
Apply framework to Poisson, Gibbs, and determinantal processes
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian mixture models with interacting atoms
Closed-form posterior and predictive distributions
Palm calculus for Poisson, Gibbs, determinantal processes
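The "interacting atoms" idea listed above can be sketched with a toy Gibbs-type prior (our own illustration, not the paper's construction): an i.i.d. base measure over atom locations is exponentially tilted by pairwise potentials, and the sign of the tilt parameter `beta` switches between repulsion and attraction:

```python
import numpy as np

def log_interacting_prior(atoms, base_logpdf, potential, beta=1.0):
    """Unnormalized log-density of a toy Gibbs-type interacting-atom prior.

    atoms:       (k, d) array of atom locations
    base_logpdf: log-density of the i.i.d. base measure for one atom
    potential:   symmetric pairwise interaction, larger when atoms are close
    beta:        beta > 0 penalizes close pairs (repulsion),
                 beta < 0 rewards them (attraction), beta = 0 recovers i.i.d.
    """
    k = atoms.shape[0]
    # i.i.d. part: the base measure evaluated at each atom.
    logp = sum(base_logpdf(atoms[i]) for i in range(k))
    # Interaction part: tilt by every pairwise potential.
    for i in range(k):
        for j in range(i + 1, k):
            logp -= beta * potential(atoms[i], atoms[j])
    return logp
```

With a Gaussian base and a squared-exponential potential, two well-separated atoms receive a higher prior log-density than two coincident ones whenever `beta > 0`, which is the mechanism behind the better-separated clusters discussed above.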
Mario Beraha
Department of Economics, Management and Statistics, University of Milano-Bicocca
Bayesian statistics, Bayesian nonparametrics, Wasserstein metric
R. Argiento
Department of Economics, University of Bergamo
F. Camerlenghi
Department of Economics, Management and Statistics, University of Milano-Bicocca
A. Guglielmi
Department of Mathematics, Politecnico di Milano