In-Context Learning of Temporal Point Processes with Foundation Inference Models

📅 2025-09-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing marked temporal point process (MTPP) modeling relies on task-specific architectures trained separately for each application domain, resulting in poor generalizability and high deployment overhead. To address this, we propose FIM-PP, the first foundation inference model for MTPPs grounded in in-context learning. FIM-PP is pretrained on large-scale synthetic data generated from Hawkes processes and combines amortized inference with in-context learning for temporal point process modeling. Crucially, it eliminates the need for task-specific architectural modifications and supports both zero-shot transfer and lightweight fine-tuning. Extensive evaluation across multiple standard benchmarks demonstrates that FIM-PP matches the performance of dedicated models, either zero-shot or with minimal fine-tuning, substantially improving model universality, adaptability, and practical utility in real-world MTPP applications.

📝 Abstract
Modeling event sequences of multiple event types with marked temporal point processes (MTPPs) provides a principled way to uncover governing dynamical rules and predict future events. Current neural network approaches to MTPP inference rely on training separate, specialized models for each target system. We pursue a radically different approach: drawing on amortized inference and in-context learning, we pretrain a deep neural network to infer, in-context, the conditional intensity functions of event histories from a context defined by sets of event sequences. Pretraining is performed on a large synthetic dataset of MTPPs sampled from a broad distribution of Hawkes processes. Once pretrained, our Foundation Inference Model for Point Processes (FIM-PP) can estimate MTPPs from real-world data without any additional training, or be rapidly finetuned to target systems. Experiments show that this amortized approach matches the performance of specialized models on next-event prediction across common benchmark datasets. Our pretrained model, repository, and tutorials will soon be available online.
Problem

Research questions and friction points this paper is trying to address.

Modeling event sequences with marked temporal point processes
Inferring conditional intensity functions from event histories
Estimating temporal point processes without retraining for new systems
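The central object the model infers is the conditional intensity function. As a rough illustration (not the paper's code; parameter values `mu`, `alpha`, `beta` are made up), here is the intensity of a univariate Hawkes process with an exponential kernel, the family the paper draws its pretraining data from:

```python
import numpy as np

def hawkes_intensity(t, history, mu=0.5, alpha=0.8, beta=1.0):
    """Conditional intensity of a univariate Hawkes process with an
    exponential kernel:
        lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))
    `history` is an array of past event times; only events strictly
    before t contribute."""
    past = history[history < t]
    return mu + alpha * np.exp(-beta * (t - past)).sum()

events = np.array([0.2, 1.1, 1.3])
print(hawkes_intensity(2.0, events))  # excitation from three past events
```

Each past event adds a bump of height `alpha` that decays at rate `beta`, so the intensity is always at least the baseline `mu` and rises after bursts of events; a specialized model would fit this function per dataset, whereas FIM-PP infers it in-context from example sequences.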
Innovation

Methods, ideas, or system contributions that make the work stand out.

Pretrains deep neural network for in-context learning
Uses amortized inference on synthetic Hawkes processes
Enables zero-shot estimation without additional training
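The synthetic pretraining corpus is built by sampling sequences from randomly parameterized Hawkes processes. A minimal sketch of how one such sequence could be generated, using Ogata's thinning algorithm for a univariate exponential-kernel process (illustrative only; the parameter choices and function name are assumptions, not the authors' pipeline):

```python
import numpy as np

def sample_hawkes(mu=0.5, alpha=0.8, beta=1.0, T=10.0, seed=None):
    """Sample one event sequence from a univariate Hawkes process on [0, T]
    via Ogata's thinning algorithm (exponential kernel)."""
    rng = np.random.default_rng(seed)

    def intensity(t, events):
        # lambda(t) = mu + sum over past events of alpha * exp(-beta * (t - t_i))
        return mu + alpha * np.exp(-beta * (t - np.asarray(events))).sum()

    events, t = [], 0.0
    while True:
        # Between events the exponential-kernel intensity only decays, so the
        # intensity just after time t upper-bounds it until the next event.
        lam_bar = intensity(t, events)
        t += rng.exponential(1.0 / lam_bar)  # candidate arrival under the bound
        if t >= T:
            break
        if rng.uniform() * lam_bar <= intensity(t, events):
            events.append(t)  # accept with probability lambda(t) / lam_bar
    return np.array(events)

seq = sample_hawkes(seed=0)
print(len(seq), "events sampled on [0, 10]")
```

Pretraining on many such sequences, with kernel parameters drawn from a broad prior, is what lets the model amortize inference and transfer zero-shot to unseen systems.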