Learning Time-Aware Causal Representation for Model Generalization in Evolving Domains

📅 2025-06-21
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the problem that deep models easily learn spurious correlations and suffer degraded generalization under dynamic distribution shifts, this paper proposes a time-aware structural causal model (SCM) together with Static-DYNamic Causal Representation Learning (SYNC). The method explicitly separates static causal factors from time-evolving dynamic factors and models causal mechanism drift, enabling time-aware causal representation learning. The authors theoretically show that the framework yields the optimal causal predictor for each time domain. By integrating information-theoretic objectives into a sequential VAE architecture, SYNC produces compact, interpretable causal representations that transfer across domains. Extensive experiments on synthetic and real-world time-series benchmarks demonstrate that the approach outperforms existing evolving domain generalization methods, achieving superior temporal generalization and robustness to distribution shifts.

📝 Abstract
Endowing deep models with the ability to generalize in dynamic scenarios is of vital significance for real-world deployment, given the continuous and complex changes in data distribution. Recently, evolving domain generalization (EDG) has emerged to address distribution shifts over time, aiming to capture evolving patterns for improved model generalization. However, existing EDG methods may suffer from spurious correlations by modeling only the dependence between data and targets across domains, creating a shortcut between task-irrelevant factors and the target, which hinders generalization. To this end, we design a time-aware structural causal model (SCM) that incorporates dynamic causal factors and the causal mechanism drifts, and propose Static-DYNamic Causal Representation Learning (SYNC), an approach that effectively learns time-aware causal representations. Specifically, it integrates specially designed information-theoretic objectives into a sequential VAE framework which captures evolving patterns, and produces the desired representations by preserving intra-class compactness of causal factors both across and within domains. Moreover, we theoretically show that our method can yield the optimal causal predictor for each time domain. Results on both synthetic and real-world datasets exhibit that SYNC can achieve superior temporal generalization performance.
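The static/dynamic factorization described in the abstract can be illustrated with a toy latent trajectory: a static causal block shared by all time domains, concatenated with a dynamic block whose distribution drifts over time. All names, dimensions, and the linear drift below are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Static causal factors: fixed across all time domains.
z_static = rng.normal(size=3)

def dynamic_factor(t, drift=0.5):
    """Dynamic latent at time domain t; its mean drifts linearly with t
    to mimic causal mechanism drift (linear drift is a toy assumption)."""
    return rng.normal(loc=drift * t, scale=0.1, size=2)

# Latent for each of 4 consecutive time domains: [static | dynamic(t)].
latents = [np.concatenate([z_static, dynamic_factor(t)]) for t in range(4)]

# The static block is identical across domains; only the dynamic block changes.
assert all(np.allclose(z[:3], z_static) for z in latents)
```

A sequential VAE in this setting would infer such a factorized latent from the observed sequence, rather than sampling it as done here.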
Problem

Research questions and friction points this paper is trying to address.

Address distribution shifts over time for model generalization
Eliminate spurious correlations in evolving domain generalization
Learn time-aware causal representations for dynamic scenarios
Innovation

Methods, ideas, or system contributions that make the work stand out.

Time-aware structural causal model (SCM)
Static-Dynamic causal representation learning (SYNC)
Sequential VAE framework with information-theoretic objectives
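The "intra-class compactness" property mentioned in the abstract can be sketched as a simple penalty on learned causal representations: same-class representations should cluster around their class centroid. The function below is an illustrative stand-in for that idea, not the paper's actual information-theoretic objective.

```python
import numpy as np

def intra_class_compactness(z_causal, labels):
    """Mean squared distance of each causal representation to its class
    centroid. Lower values mean same-class causal factors cluster tightly,
    the compactness property SYNC's objectives encourage (our reading)."""
    loss = 0.0
    for c in np.unique(labels):
        zc = z_causal[labels == c]          # representations of class c
        centroid = zc.mean(axis=0)
        loss += ((zc - centroid) ** 2).sum()
    return loss / len(z_causal)

# Toy check: tight same-class clusters incur a smaller penalty than loose ones.
labels = np.array([0, 0, 1, 1])
tight = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
loose = np.array([[0.0, 0.0], [3.0, 0.0], [5.0, 5.0], [8.0, 5.0]])
assert intra_class_compactness(tight, labels) < intra_class_compactness(loose, labels)
```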
Zhuo He
Beijing Institute of Technology
Transfer Learning, Causal Learning
Shuang Li
Independent Researcher, China
Wenze Song
Independent Researcher, China
Longhui Yuan
Independent Researcher, China
Jian Liang
Kuaishou Inc.
transfer learning, graph learning
Han Li
Kuaishou Technology, China
Kun Gai
Senior Director & Researcher, Alibaba Group
Machine Learning, Computational Advertising