Sampling on Discrete Spaces with Temporal Point Processes

πŸ“… 2026-03-09
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This work addresses the problem of efficiently sampling from discrete multivariate count distributions with downward-closed support. To this end, the authors propose a novel sampler based on temporal point processes, which approximates the target distribution through vectors of event counts observed within a fixed-length sliding window. Sampling is realized via a coupled infinite-server queueing system, and the method incorporates a discrete momentum mechanism that accommodates both reversible and non-reversible dynamics. Notably, its architecture aligns with biological neural mechanisms, enabling seamless integration into stochastic recurrent neural networks. Theoretical analysis and empirical evaluation across 63 target distributions demonstrate that the proposed approach consistently outperforms classical birth–death processes in terms of multivariate effective sample size, and typically surpasses the Zanella process as well; these advantages become even more pronounced after normalization by CPU time.


πŸ“ Abstract
Temporal point processes offer a powerful framework for sampling from discrete distributions, yet they remain underutilized in existing literature. We show how to construct, for any target multivariate count distribution with downward-closed support, a multivariate temporal point process whose event-count vector in a fixed-length sliding window converges in distribution to the target as time tends to infinity. Structured as a system of potentially coupled infinite-server queues with deterministic service times, the sampler exhibits a discrete form of momentum that suppresses random-walk behaviour. The admissible families of processes permit both reversible and non-reversible dynamics. As an application, we derive a recurrent stochastic neural network whose dynamics implement sampling-based computation and exhibit some biologically plausible features, including relative refractory periods and oscillatory dynamics. The introduction of auxiliary randomness reduces the sampler to a birth-death process, establishing the latter as a degenerate case with the same limiting distribution. In simulations on 63 target distributions, our sampler always outperforms these birth-death processes and frequently outperforms Zanella processes in multivariate effective sample size, with further gains when normalized by CPU time.
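The abstract's central construction is a point process whose event count in a fixed-length sliding window converges to the target distribution. In the simplest, uncoupled case, a homogeneous Poisson process (equivalently, an M/D/∞ queue whose deterministic service time equals the window length), the window count is exactly Poisson with mean rate × window. The sketch below illustrates only that degenerate case, not the paper's coupled sampler; the function name and parameters are illustrative, not from the paper.

```python
import bisect
import random


def sliding_window_counts(rate, window, horizon, step=0.05, seed=0):
    """Simulate a homogeneous Poisson process with intensity `rate` and
    return event counts inside a sliding window of fixed length `window`.

    In this uncoupled case the count in any window is Poisson(rate * window),
    so the empirical mean of the counts should approach rate * window.
    """
    rng = random.Random(seed)
    # Generate sorted event times via exponential inter-arrival gaps.
    events, t = [], 0.0
    while t < horizon:
        t += rng.expovariate(rate)
        events.append(t)
    # Slide the window end-point s across [window, horizon) and count
    # events in (s - window, s] using binary search for efficiency.
    counts = []
    s = window
    while s < horizon:
        hi = bisect.bisect_right(events, s)
        lo = bisect.bisect_right(events, s - window)
        counts.append(hi - lo)
        s += step
    return counts


counts = sliding_window_counts(rate=2.0, window=1.5, horizon=500.0)
mean = sum(counts) / len(counts)
# Empirical mean should be close to rate * window = 3.0
```

Consecutive window counts overlap and are therefore correlated; this is the discrete momentum the abstract refers to, and the coupled, non-reversible constructions in the paper exploit it rather than averaging it away.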
Problem

Research questions and friction points this paper is trying to address.

discrete sampling
temporal point processes
multivariate count distribution
downward-closed support
stochastic sampling
Innovation

Methods, ideas, or system contributions that make the work stand out.

temporal point processes
discrete sampling
infinite-server queues
stochastic neural networks
non-reversible dynamics
Cameron A. Stewart
Gatsby Computational Neuroscience Unit, University College London, 25 Howland Street, London W1T 4JG, U.K.
Maneesh Sahani
Gatsby Unit, UCL
Theoretical Neuroscience and Machine Learning