🤖 AI Summary
This work addresses the challenge that continuously evolving dynamic networks pose for link prediction models, which often struggle to adapt to complex temporal dynamics and lack robustness to emerging structural patterns. To this end, we propose CoDCL, the first plug-and-play general framework that integrates counterfactual data augmentation into dynamic graph representation learning. CoDCL generates high-quality counterfactual samples through dynamic intervention design and structural neighborhood exploration, and leverages contrastive learning to enhance the model's temporal adaptability. Notably, our approach requires no modification to existing architectures and consistently outperforms state-of-the-art baselines across multiple real-world dynamic network datasets, demonstrating the critical role of counterfactual augmentation in dynamic link prediction.
📄 Abstract
The rapid growth and continuous structural evolution of dynamic networks make effective prediction increasingly challenging. To adapt to complex temporal environments, prediction models need to be robust to emerging structural changes. We propose CoDCL, a dynamic network learning framework that combines counterfactual data augmentation with contrastive learning to address this deficiency. Furthermore, we devise a comprehensive strategy for generating high-quality counterfactual data, combining a dynamic treatment design with efficient structural neighborhood exploration to quantify temporal changes in interaction patterns. Crucially, CoDCL is designed as a plug-and-play universal module that can be seamlessly integrated into various existing temporal graph models without requiring architectural modifications. Extensive experiments on multiple real-world datasets demonstrate that CoDCL significantly outperforms state-of-the-art baseline models for dynamic networks, confirming the critical role of integrating counterfactual data augmentation into dynamic representation learning.
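To make the contrastive component concrete, the following is a minimal sketch of how counterfactual samples could serve as negatives in an InfoNCE-style contrastive objective. This is an illustration under stated assumptions, not CoDCL's actual implementation: the function name, the cosine-similarity choice, and the temperature value are all hypothetical, and the embeddings would in practice come from whatever temporal graph encoder CoDCL is plugged into.

```python
import numpy as np

def contrastive_loss(anchor, positive, negatives, temperature=0.5):
    """InfoNCE-style loss: pull the anchor node embedding toward its
    factual (observed) interaction embedding and push it away from
    counterfactual sample embeddings. All inputs are 1-D numpy arrays.
    """
    def cos(a, b):
        # Cosine similarity between two embedding vectors.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    pos_score = np.exp(cos(anchor, positive) / temperature)
    neg_score = sum(np.exp(cos(anchor, n) / temperature) for n in negatives)
    # Loss is low when the anchor is much closer to the factual sample
    # than to the counterfactual ones.
    return -np.log(pos_score / (pos_score + neg_score))

# Toy usage: the anchor agrees with the factual sample, so the loss is small.
anchor = np.array([1.0, 0.0])
factual = np.array([0.9, 0.1])          # observed interaction embedding
counterfactuals = [np.array([-1.0, 0.0])]  # intervened (counterfactual) sample
loss = contrastive_loss(anchor, factual, counterfactuals)
```

In this reading, the "dynamic treatment design" mentioned above would decide which interactions to intervene on when producing `counterfactuals`, while the temporal graph model itself stays unchanged, which is what makes the module plug-and-play.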