🤖 AI Summary
Existing GNN-based process behavior prediction models often neglect temporal dynamics and transition semantics, limiting their ability to capture the temporal and structural dependencies in business processes. To address this, we propose Time-Semantic GNN (TSGNN), a unified graph neural architecture that jointly models prefix subgraphs via GCN and full trajectories via GAT. Specifically, TSGNN: (1) introduces a dynamic time-decay attention mechanism to adaptively construct prediction-oriented temporal windows; (2) encodes transition-type semantics into edge features to mitigate structural ambiguity; and (3) integrates multi-level explainability modules for node-, edge-, and path-level attribution. Evaluated on five benchmark datasets, TSGNN achieves state-of-the-art performance in both Top-k accuracy and DL score without dataset-specific hyperparameter tuning, demonstrating strong generalization, robustness, and intrinsic interpretability.
📝 Abstract
Predictive Business Process Monitoring (PBPM) aims to forecast future events in ongoing cases based on historical event logs. While Graph Neural Networks (GNNs) are well suited to capturing structural dependencies in process data, existing GNN-based PBPM models remain underdeveloped: most rely either on short prefix subgraphs or on global architectures that overlook temporal relevance and transition semantics. We propose a unified, interpretable GNN framework that advances the state of the art along three key axes. First, we compare prefix-based Graph Convolutional Networks (GCNs) and full-trace Graph Attention Networks (GATs) to quantify the performance gap between localized and global modeling. Second, we introduce a novel time-decay attention mechanism that constructs dynamic, prediction-centered windows, emphasizing temporally relevant history and suppressing noise. Third, we embed transition-type semantics into edge features to enable fine-grained reasoning over structurally ambiguous traces. Our architecture includes multi-level interpretability modules, offering diverse visualizations of attention behavior. Evaluated on five benchmarks, the proposed models achieve competitive Top-k accuracy and DL scores without per-dataset tuning. By addressing architectural, temporal, and semantic gaps, this work presents a robust, generalizable, and explainable solution for next-event prediction in PBPM.
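To make the time-decay idea concrete, the following is a minimal illustrative sketch, not the paper's actual implementation: raw attention scores over prefix events are damped exponentially by their temporal distance from the prediction point and then renormalized. The function name, the decay rate `lam`, and the parameter `t_pred` are assumptions introduced here for illustration only.

```python
import math

def time_decay_attention(scores, timestamps, t_pred, lam=0.3):
    """Hypothetical time-decay attention sketch.

    scores:     raw attention scores for each prefix event
    timestamps: event times (same order as scores)
    t_pred:     time of the prediction point
    lam:        assumed exponential decay rate
    """
    # Damp each score by how far its event lies before the prediction point.
    damped = [s * math.exp(-lam * (t_pred - t)) for s, t in zip(scores, timestamps)]
    # Numerically stable softmax over the damped scores.
    m = max(damped)
    exps = [math.exp(d - m) for d in damped]
    z = sum(exps)
    return [e / z for e in exps]

# Three prefix events with equal raw scores: the most recent event
# receives the largest weight after decay and renormalization.
w = time_decay_attention([1.0, 1.0, 1.0], [0.0, 5.0, 9.0], t_pred=10.0)
```

Under this sketch, older events are suppressed smoothly rather than cut off by a fixed window, which matches the abstract's description of a dynamic, prediction-centered window.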