🤖 AI Summary
To address the challenge of accurate flow latency prediction in modern communication networks, this paper proposes a three-stage modeling paradigm: "heterogeneous graph neural network → KAN-enhanced heterogeneous GNN → interpretable symbolic surrogate model." We integrate Kolmogorov–Arnold Networks (KAN) into the message-passing and attention mechanisms to construct KAMP-Attn, a heterogeneous GNN tailored to network topology and traffic dynamics. We further introduce a graph-structure-preserving, closed-form symbolic distillation method that efficiently converts the black-box GNN into a weight-free, analytically tractable symbolic equation. Compared with state-of-the-art approaches, our method achieves comparable or superior prediction accuracy while reducing the parameter count by orders of magnitude. It thus enables lightweight deployment and delivers strong model interpretability, offering a novel paradigm for network performance optimization that jointly ensures high fidelity and full transparency.
📝 Abstract
Accurate prediction of flow delay is essential for optimizing and managing modern communication networks. We investigate three levels of modeling for this task. First, we implement a heterogeneous GNN with attention-based message passing, establishing a strong neural baseline. Second, we propose FlowKANet, in which Kolmogorov–Arnold Networks replace standard MLP layers, reducing trainable parameters while maintaining competitive predictive performance. FlowKANet integrates KAMP-Attn (Kolmogorov–Arnold Message Passing with Attention), embedding KAN operators directly into the message-passing and attention computations. Finally, we distill the model into symbolic surrogates using block-wise regression, producing closed-form equations that eliminate trainable weights while preserving graph-structured dependencies. The results show that KAN layers provide a favorable trade-off between efficiency and accuracy, and that the symbolic surrogates demonstrate the potential for lightweight deployment and enhanced transparency.
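The core idea of KAMP-Attn, embedding KAN operators into the message and attention functions of a GNN, can be illustrated with a minimal sketch. This is not the paper's implementation: the RBF-based univariate functions (a stand-in for the B-spline parameterization commonly used in KANs), the layer sizes, and all names (`KANLayer`, `kan_message_passing`) are illustrative assumptions.

```python
import numpy as np

def rbf_basis(x, centers, width=0.5):
    # Evaluate Gaussian radial basis functions for scalar inputs.
    # x: (batch,), centers: (k,) -> (batch, k)
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width**2))

class KANLayer:
    """Simplified Kolmogorov-Arnold layer: each output is a sum of
    learnable univariate functions of each input, phi_{ij}(x_i),
    parameterized by coefficients over a fixed RBF basis.
    (Illustrative stand-in for the spline-based KAN layers.)"""
    def __init__(self, in_dim, out_dim, n_basis=8, seed=0):
        rng = np.random.default_rng(seed)
        self.centers = np.linspace(-2.0, 2.0, n_basis)
        # coefficients: (in_dim, out_dim, n_basis)
        self.coef = rng.normal(scale=0.1, size=(in_dim, out_dim, n_basis))

    def __call__(self, x):
        # x: (batch, in_dim) -> (batch, out_dim)
        out = np.zeros((x.shape[0], self.coef.shape[1]))
        for i in range(x.shape[1]):
            B = rbf_basis(x[:, i], self.centers)  # (batch, n_basis)
            out += B @ self.coef[i].T             # accumulate sum_i phi_i(x_i)
        return out

def kan_message_passing(h, edges, msg_kan, att_kan):
    """One round of message passing where both the message function and
    the (scalar) attention score are KAN layers, as in the KAMP-Attn idea.
    h: (n_nodes, d) node features; edges: list of (src, dst) pairs."""
    n, _ = h.shape
    out = np.zeros_like(h)
    for dst in range(n):
        srcs = [s for s, t in edges if t == dst]
        if not srcs:
            continue  # nodes with no incoming edges keep a zero update
        pair = np.concatenate([h[srcs], np.tile(h[dst], (len(srcs), 1))], axis=1)
        msgs = msg_kan(pair)            # (k, d) per-edge messages
        logits = att_kan(pair)[:, 0]    # scalar attention score per edge
        w = np.exp(logits - logits.max())
        w /= w.sum()                    # softmax over incoming edges
        out[dst] = (w[:, None] * msgs).sum(axis=0)
    return out
```

In a full model this update would be applied per node/edge type of the heterogeneous graph and stacked over several rounds; the sketch shows only why the parameter count shrinks, since each KAN edge function is a small set of basis coefficients rather than a dense MLP.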