🤖 AI Summary
This work addresses the challenge of unifying positive (attractive) and negative (repulsive) dependence in models of random subsets. The authors propose the discrete kernel point process (DKPP), a differentiable, learnable family of discrete point processes that subsumes determinantal point processes (DPPs) and a subclass of Boltzmann machines. DKPP constructs discrete point processes from kernel functions, enabling continuous, explicit control over both the type and the strength of dependence. It supports differentiable computation of marginal and conditional probabilities, as well as unbiased gradient estimation for parameter learning. Theoretically grounded and empirically validated, DKPP offers precise, interpretable control over dependence, and experiments demonstrate its efficiency, training stability, and strong performance across probabilistic inference and parameter learning tasks.
📝 Abstract
Positive and negative dependence are fundamental concepts that characterize the attractive and repulsive behavior of random subsets. Although some probabilistic models are known to exhibit one or the other, bridging positive and negative dependence within a single practicable probabilistic model remains challenging. In this study, we introduce a new family of distributions, the discrete kernel point process (DKPP), which includes determinantal point processes and a subclass of Boltzmann machines. We also develop computational methods for probabilistic operations and inference with DKPPs, such as computing marginal and conditional probabilities and learning the parameters. Our numerical experiments demonstrate the controllability of positive and negative dependence and the effectiveness of the proposed computational methods.
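The DKPP parameterization itself is not reproduced here, but the negative-dependence end of the spectrum it covers can be made concrete with its determinantal special case: a DPP assigns each subset \(S\) probability \(P(S) = \det(L_S)/\det(L+I)\) for a positive semidefinite kernel \(L\), so similar items (large off-diagonal kernel entries) rarely co-occur. The item features and kernel below are illustrative assumptions, not values from the paper; this is a minimal sketch of the standard DPP formula, not of DKPP itself.

```python
import itertools
import numpy as np

# Hypothetical 3-item ground set with feature vectors; items 0 and 1 are
# nearly parallel (highly similar), item 2 is orthogonal to item 0.
phi = np.array([[1.0, 0.0],
                [0.9, 0.1],
                [0.0, 1.0]])
L = phi @ phi.T  # Gram-matrix kernel, guaranteed positive semidefinite

# Normalizer: sum over all subsets S of det(L_S) equals det(L + I).
Z = np.linalg.det(L + np.eye(3))

def prob(S):
    """DPP probability P(S) = det(L_S) / det(L + I); det of the 0x0 submatrix is 1."""
    S = list(S)
    return np.linalg.det(L[np.ix_(S, S)]) / Z

p01 = prob([0, 1])  # two similar items together
p02 = prob([0, 2])  # two dissimilar items together
```

Because items 0 and 1 are nearly collinear, `p01` is far smaller than `p02`: the determinant shrinks as the selected rows of `phi` become linearly dependent, which is exactly the repulsive behavior the abstract refers to as negative dependence.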