CPformer -- Concept and Physics enhanced Transformer for Time Series Forecasting

📅 2025-08-02
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Addressing the challenge of simultaneously achieving accuracy, interpretability, and physical consistency in multivariate time series forecasting, while overcoming the limited generalization caused by domain-specific statistical heterogeneity, this paper proposes CPformer: the first framework integrating concept disentanglement learning and physics-first principles into a Transformer architecture. Methodologically, CPformer fuses five domain-agnostic self-supervised concept representations and introduces a differentiable physics-based residual constraint module to enable scientifically grounded, transparent forecasting; it further supports multi-scale modeling to jointly capture long-range dependencies and physical plausibility. Evaluated on six benchmark datasets across eight metrics, CPformer achieves state-of-the-art performance, with MSE reductions of 23%, 44%, and 61% over FEDformer on the Electricity, Traffic, and Illness datasets, respectively. The framework advances prediction accuracy, model interpretability, and cross-domain generalization.


📝 Abstract
Accurate, explainable, and physically credible forecasting remains a persistent challenge for multivariate time series whose statistical properties vary across domains. We present CPformer, a Concept- and Physics-enhanced Transformer that channels every prediction through five self-supervised, domain-agnostic concepts while enforcing differentiable residuals drawn from first-principle constraints. Unlike prior efficiency-oriented Transformers that rely purely on sparsity or frequency priors, CPformer combines latent transparency with hard scientific guidance while retaining attention for long contexts. Across six publicly available datasets (sub-hourly Electricity and Traffic, hourly ETT, high-dimensional Weather, weekly Influenza-like Illness, and minute-level Exchange Rate), CPformer achieves the lowest error in eight of twelve MSE/MAE cells. Relative to the strongest Transformer baseline (FEDformer), CPformer reduces mean squared error by 23% on Electricity, 44% on Traffic, and 61% on Illness, while matching performance on the strictly periodic Weather and ETT series.
Problem

Research questions and friction points this paper is trying to address.

Improving accuracy and explainability in multivariate time-series forecasting
Integrating domain-agnostic concepts with physics-based constraints
Retaining long-context attention while ensuring physical credibility
Innovation

Methods, ideas, or system contributions that make the work stand out.

Concept-enhanced Transformer for explainable forecasting
Physics-guided residuals from first-principle constraints
Self-supervised domain-agnostic concepts integration
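The physics-guided residual idea above can be sketched as a composite training loss: a standard forecast error term plus a penalty on how far the predicted trajectory deviates from a governing equation. A minimal NumPy illustration follows; the first-order decay law and the names `physics_residual_loss`, `lam`, and `decay` are assumptions for illustration, not the paper's actual constraints.

```python
import numpy as np

def physics_residual_loss(y_pred, y_true, lam=0.1, decay=0.5, dt=1.0):
    """Sketch of a physics-guided residual loss (hypothetical form).

    Assumes a first-order decay law dy/dt = -decay * y as the
    'first-principle' constraint; the paper's real constraints are
    not specified in this summary.
    """
    # standard data-fit term
    mse = np.mean((y_pred - y_true) ** 2)
    # discrete-time physics residual: (y[t+1] - y[t]) / dt + decay * y[t]
    # is zero when the prediction exactly follows the decay law
    residual = (y_pred[1:] - y_pred[:-1]) / dt + decay * y_pred[:-1]
    phys = np.mean(residual ** 2)
    return mse + lam * phys
```

In a differentiable framework the same penalty would be backpropagated through the Transformer, steering predictions toward physically plausible trajectories rather than post-hoc filtering them.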