FedDis: A Causal Disentanglement Framework for Federated Traffic Prediction

📅 2026-01-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the tension between model generalization and local adaptability in federated traffic forecasting caused by non-IID data distributions. To resolve this, we propose FedDis, a causal disentanglement framework that, for the first time, introduces causal disentanglement into federated spatiotemporal prediction. FedDis employs a dual-branch architecture to separately model client-specific dynamics and cross-client shared spatiotemporal patterns, enforcing information orthogonality between the two through mutual information minimization. This design enables effective unification of global knowledge transfer and local personalization while preserving data privacy. Extensive experiments on four real-world traffic datasets demonstrate that FedDis significantly outperforms existing methods, achieving state-of-the-art performance in prediction accuracy, computational efficiency, and scalability.

📝 Abstract
Federated learning offers a promising paradigm for privacy-preserving traffic prediction, yet its performance is often challenged by the non-identically and independently distributed (non-IID) nature of decentralized traffic data. Existing federated methods frequently struggle with this data heterogeneity, typically entangling globally shared patterns with client-specific local dynamics within a single representation. In this work, we postulate that this heterogeneity stems from the entanglement of two distinct generative sources: client-specific localized dynamics and cross-client global spatial-temporal patterns. Motivated by this perspective, we introduce FedDis, a novel framework that, to the best of our knowledge, is the first to leverage causal disentanglement for federated spatial-temporal prediction. Architecturally, FedDis comprises a dual-branch design in which a Personalized Bank learns to capture client-specific factors, while a Global Pattern Bank distills common knowledge. This separation enables robust cross-client knowledge transfer while preserving high adaptability to unique local environments. Crucially, a mutual information minimization objective enforces informational orthogonality between the two branches, thereby ensuring effective disentanglement. Comprehensive experiments on four real-world benchmark datasets demonstrate that FedDis consistently achieves state-of-the-art accuracy, strong efficiency, and superior scalability.
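The dual-branch design with an information-orthogonality objective can be sketched in a few lines. The sketch below is a minimal illustration, not the paper's implementation: the linear encoders, shapes, and variable names are hypothetical, and a squared cross-correlation penalty stands in for the mutual-information minimization objective (the paper's actual MI estimator is not specified here).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy client data: T timesteps of d-dimensional traffic features.
T, d, k = 64, 8, 4
X = rng.normal(size=(T, d))

# Dual-branch encoders (hypothetical linear maps):
# W_p -> client-specific features ("Personalized Bank" branch),
# W_g -> shared features ("Global Pattern Bank" branch).
W_p = rng.normal(size=(d, k)) * 0.1
W_g = rng.normal(size=(d, k)) * 0.1

def branch_features(X, W_p, W_g):
    """Encode the input once per branch."""
    return X @ W_p, X @ W_g

def orthogonality_penalty(Z_p, Z_g):
    """Sum of squared cross-covariances between the branches --
    a simple decorrelation proxy for MI minimization."""
    Zp = Z_p - Z_p.mean(axis=0)
    Zg = Z_g - Z_g.mean(axis=0)
    C = (Zp.T @ Zg) / len(Zp)  # k x k cross-covariance matrix
    return float((C ** 2).sum())

Z_p, Z_g = branch_features(X, W_p, W_g)
penalty = orthogonality_penalty(Z_p, Z_g)
# In training, total loss = prediction_loss + lam * penalty;
# only the global branch (W_g) would be averaged across clients,
# while W_p remains local to each client.
print(penalty >= 0.0)
```

In a full federated loop, driving this penalty toward zero decorrelates the two representations, which is the sense in which the branches become informationally orthogonal.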
Problem

Research questions and friction points this paper is trying to address.

federated learning
traffic prediction
non-IID
data heterogeneity
causal disentanglement
Innovation

Methods, ideas, or system contributions that make the work stand out.

causal disentanglement
federated learning
spatial-temporal prediction
non-IID data
mutual information minimization