DDOT: A Derivative-directed Dual-decoder Ordinary Differential Equation Transformer for Dynamic System Modeling

📅 2025-06-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional symbolic regression struggles to capture temporal dependencies and variable couplings in dynamical systems, while existing ODEFormer approaches exhibit sensitivity to initial conditions and lack robust evaluation. This paper proposes a novel method for ordinary differential equation (ODE) discovery from single-trajectory data. The authors design a dual-decoder Transformer architecture that jointly optimizes symbolic expression learning and derivative prediction. They further introduce DIV-diff, a divergence-based evaluation metric that quantifies model performance via grid-point divergence analysis, yielding a more stable and comprehensive assessment. On the ODEBench benchmark, the method improves R² pass rates by 4.58% (reconstruction) and 1.62% (generalization), while reducing DIV-diff by 3.55%. Validation on a real-world anesthesia dataset confirms both the interpretability and practical utility of the learned ODE models.

📝 Abstract
Uncovering the underlying ordinary differential equations (ODEs) that govern dynamic systems is crucial for advancing our understanding of complex phenomena. Traditional symbolic regression methods often struggle to capture the temporal dynamics and intervariable correlations inherent in ODEs. ODEFormer, a state-of-the-art method for inferring multidimensional ODEs from single trajectories, has made notable progress. However, its focus on single-trajectory evaluation is highly sensitive to initial conditions, which may not fully reflect true performance. To address this, we propose the divergence difference metric (DIV-diff), which evaluates divergence over a grid of points within the target region, offering a comprehensive and stable analysis of the variable space. Alongside it, we introduce DDOT (Derivative-Directed Dual-Decoder Ordinary Differential Equation Transformer), a transformer-based model designed to reconstruct multidimensional ODEs in symbolic form. By incorporating an auxiliary task that predicts the ODE's derivative, DDOT effectively captures both structure and dynamic behavior. Experiments on ODEBench show DDOT outperforms existing symbolic regression methods, achieving absolute improvements of 4.58% and 1.62% in $P(R^2 > 0.9)$ for reconstruction and generalization tasks, respectively, and an absolute reduction of 3.55% in DIV-diff. Furthermore, DDOT demonstrates real-world applicability on an anesthesia dataset, highlighting its practical impact.
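The abstract does not spell out the exact formula for DIV-diff, but it describes it as comparing divergence over a grid of points in the target region. A minimal numpy sketch, assuming DIV-diff averages the absolute difference between the divergences of the true and predicted vector fields (function and variable names here are illustrative, not the paper's):

```python
import numpy as np

def divergence(f, x, eps=1e-5):
    """Central finite-difference divergence of a vector field f at point x."""
    n = len(x)
    div = 0.0
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        div += (f(x + dx)[i] - f(x - dx)[i]) / (2 * eps)
    return div

def div_diff(f_true, f_pred, grid):
    """Mean absolute divergence difference over a grid of points."""
    return float(np.mean([abs(divergence(f_true, x) - divergence(f_pred, x))
                          for x in grid]))

# Example: for linear systems dx/dt = A x, the divergence equals trace(A).
A = np.array([[-1.0, 2.0], [0.5, -0.5]])   # trace = -1.5
B = np.array([[-1.0, 2.0], [0.5, -1.0]])   # trace = -2.0
grid = [np.array([x, y]) for x in np.linspace(-1, 1, 5)
                         for y in np.linspace(-1, 1, 5)]
score = div_diff(lambda x: A @ x, lambda x: B @ x, grid)
print(round(score, 3))  # 0.5, i.e. |(-1.5) - (-2.0)|
```

For linear fields the finite-difference divergence is exact, so the metric here reduces to the trace gap between the two systems; the paper's actual grid choice and normalization may differ.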
Problem

Research questions and friction points this paper is trying to address.

Uncovering underlying ODEs governing dynamic systems
Improving symbolic regression for temporal dynamics in ODEs
Enhancing ODE reconstruction accuracy and generalization performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Transformer-based model for ODE reconstruction
Auxiliary task predicts ODE derivatives
Divergence difference metric for stable analysis
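The dual-decoder idea above pairs the usual symbolic-expression objective with an auxiliary derivative-prediction task. The paper's exact training objective is not given on this page; a minimal numpy sketch, assuming the symbolic decoder is trained with token cross-entropy and the derivative decoder with an MSE term weighted by a hypothetical coefficient `lam`:

```python
import numpy as np

def joint_loss(sym_logits, sym_targets, deriv_pred, deriv_true, lam=0.5):
    """Combine symbolic-decoder cross-entropy with an auxiliary
    derivative-prediction MSE, as in a dual-decoder setup."""
    # Softmax cross-entropy over the predicted expression tokens.
    probs = np.exp(sym_logits - sym_logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    ce = -np.mean(np.log(probs[np.arange(len(sym_targets)), sym_targets]))
    # MSE between predicted and ground-truth derivatives dx/dt.
    mse = np.mean((deriv_pred - deriv_true) ** 2)
    return ce + lam * mse

# Confident token predictions plus exact derivatives give a near-zero loss;
# perturbing the derivative prediction raises it through the auxiliary term.
logits = np.array([[10.0, 0.0], [0.0, 10.0]])
targets = np.array([0, 1])
d = np.array([1.0, 2.0])
low = joint_loss(logits, targets, d, d)
high = joint_loss(logits, targets, d + 1.0, d)
```

The auxiliary term acts as a regularizer: the symbolic decoder alone can match tokens without matching dynamics, while the derivative loss anchors the model to the trajectory's actual rate of change.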