🤖 AI Summary
Temporal point process (TPP) models exhibit strong expressive power but rely on autoregressive sampling, which precludes parallelization and hinders scalability. This work introduces the first speculative sampling framework for parametric TPPs—requiring no model architecture modification or retraining—enabling exact, parallel multi-step event generation. The method integrates rejection sampling with a lightweight autoregressive draft predictor: given the event history, it batch-generates candidate event sequences and validates each via exact probability checks to guarantee sampling fidelity. The authors prove that the resulting event distribution is identical to that of the original TPP. Empirical evaluation across multiple real-world datasets demonstrates speedups of 3.2–5.8× over standard autoregressive sampling with zero accuracy loss, bridging the longstanding gap between high-expressivity TPP modeling and efficient inference.
📝 Abstract
Temporal point processes (TPPs) are powerful generative models for event sequences that capture complex dependencies in time-series data. They are commonly specified as autoregressive models that learn the distribution of the next event conditioned on all previous events. This makes sampling inherently sequential, limiting efficiency. In this paper, we propose a novel algorithm based on rejection sampling that enables exact sampling of multiple future events from existing TPP models, in parallel, and without requiring any architectural changes or retraining. Besides theoretical guarantees, our method demonstrates empirical speedups on real-world datasets, bridging the gap between expressive modeling and efficient parallel generation for large-scale TPP applications.
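To make the accept/reject idea concrete, here is a minimal Python sketch of one speculative round. All names, the interface, and the toy exponential densities are assumptions for illustration, not the paper's implementation; in particular, the exact algorithm also resamples a rejected position from the residual distribution max(0, p − q), which is omitted here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-step densities over the next inter-event time:
# the "target" stands in for the expressive TPP, the "draft" for a
# cheaper approximate model. Both are toy exponentials here.
def target_logpdf(history, dt):   # target: Exp(rate=2)
    return np.log(2.0) - 2.0 * dt

def draft_logpdf(history, dt):    # draft: Exp(rate=1)
    return -dt

def draft_sampler(history):
    return rng.exponential(scale=1.0)

def speculative_round(n_draft, history=()):
    """One speculative round: accept draft events until the first rejection.

    Each candidate x is accepted with probability min(1, p(x)/q(x)),
    the standard speculative-sampling rule; accepted events extend the
    history so later candidates condition on them.
    """
    accepted, h = [], list(history)
    for _ in range(n_draft):
        x = draft_sampler(h)
        log_ratio = target_logpdf(h, x) - draft_logpdf(h, x)
        if np.log(rng.random()) < min(0.0, log_ratio):
            accepted.append(x)
            h.append(x)
        else:
            break  # first rejection ends the round
    return accepted

events = speculative_round(n_draft=8)
```

Because acceptance checks only require evaluating the target density at the drafted events, all candidates in a round can in principle be scored in one batched forward pass, which is where the parallel speedup comes from.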