AI Summary
This study addresses the challenge of modeling asynchronous event sequences, focusing on capturing long-range dependencies, forecasting future events, discovering implicit causal relationships, and enhancing model interpretability. We propose a unified four-module framework comprising a history encoder, a parameterized conditional intensity function, an event relation discovery mechanism, and an optimized learning strategy. A key innovation is the integration of variational inference-based discrete graph structure learning to implicitly estimate Granger causality graphs, thereby strengthening modeling capacity without sacrificing interpretability. Furthermore, we extend both the encoder architecture and the family of intensity functions. Extensive empirical evaluation across diverse benchmark datasets demonstrates significant improvements in event sequence fitting accuracy and predictive performance. Results also validate the effectiveness and generalizability of the learned implicit causal graphs for modeling heterogeneous event relationships.
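A common way to make a discrete Granger-causality graph learnable inside a variational framework is the Gumbel-softmax (concrete) relaxation, which yields differentiable near-binary edge samples. The sketch below is a generic illustration of that technique, not the paper's exact parameterization: the edge-score matrix `logits` and the temperature `tau` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax_edges(logits, tau=0.5):
    """Relaxed Bernoulli sample of a K x K adjacency matrix via the
    Gumbel-softmax trick. logits[i, j] is an (assumed) learned score
    for an edge from event type j to event type i; tau controls how
    close the soft samples are to binary."""
    # one Gumbel noise draw per class ("edge present" vs "edge absent")
    g1 = -np.log(-np.log(rng.uniform(size=logits.shape)))
    g0 = -np.log(-np.log(rng.uniform(size=logits.shape)))
    # two-class softmax; the "absent" class carries a zero logit
    present = np.exp((logits + g1) / tau)
    absent = np.exp(g0 / tau)
    return present / (present + absent)  # soft adjacency in (0, 1)
```

As `tau -> 0` the samples approach hard 0/1 edges, while for moderate `tau` gradients can flow through the sampled graph into the edge scores.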
Abstract
A temporal point process is a stochastic process on the continuous time domain and is commonly used to model asynchronous event sequences characterized by occurrence timestamps. Thanks to the strong expressive power of deep neural networks, deep temporal point processes are emerging as a promising choice for capturing the patterns in asynchronous sequences. In this paper, we first review recent research emphases and difficulties in modeling asynchronous event sequences with deep temporal point processes, which can be grouped into four areas: encoding of the history sequence, formulation of the conditional intensity function, relational discovery among events, and learning approaches for optimization. We introduce most recently proposed models by dismantling them into these four parts, and conduct experiments by recombining the first three parts under the same learning strategy for a fair empirical evaluation. In addition, we extend the family of history encoders and conditional intensity functions, and propose a Granger causality discovery framework for exploiting the relations among multiple event types. Because Granger causality can be represented by a Granger causality graph, we employ discrete graph structure learning in the framework of variational inference to reveal the latent structure of the graph. Further experiments show that the proposed framework with latent graph discovery can both capture these relations and achieve improved fitting and prediction performance.
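The conditional intensity function can be illustrated with the simplest classical member of the family, a univariate exponential-kernel Hawkes process. The deep models surveyed here replace this closed form with neural parameterizations; the parameter values `mu`, `alpha`, `beta` below are purely illustrative.

```python
import math

def hawkes_intensity(t, history, mu=0.2, alpha=0.8, beta=1.0):
    """Conditional intensity lambda(t | history) of a univariate
    exponential Hawkes process: baseline rate mu plus a contribution
    alpha * exp(-beta * (t - t_i)) from each past event t_i < t."""
    return mu + sum(alpha * math.exp(-beta * (t - ti))
                    for ti in history if ti < t)

def neg_log_likelihood(events, T, mu=0.2, alpha=0.8, beta=1.0):
    """Sequence NLL on [0, T]:
       -sum_i log lambda(t_i) + integral_0^T lambda(s) ds,
    the objective typically minimized when fitting a point process.
    The integral (compensator) has a closed form for this kernel."""
    log_term = sum(math.log(hawkes_intensity(t, events, mu, alpha, beta))
                   for t in events)
    integral = mu * T + sum(alpha / beta * (1 - math.exp(-beta * (T - ti)))
                            for ti in events)
    return integral - log_term
```

Deep variants keep this same likelihood objective but compute the intensity from a learned encoding of the history instead of a fixed kernel.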