Generalizing Dynamics Modeling More Easily from Representation Perspective

📅 2026-03-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the limited generalization of existing dynamical modeling approaches, which typically require system-specific modeling. To overcome this limitation, the paper introduces PDEDER, a pre-trained dynamics encoder that, for the first time, adapts the pre-training paradigm to dynamical system modeling. PDEDER employs neural ordinary differential equations (Neural ODEs) to construct a latent representation of system dynamics and jointly optimizes reconstruction, prediction objectives, and Lyapunov exponent constraints to ensure a well-structured and stable latent space. Evaluated across twelve diverse dynamical systems, PDEDER significantly outperforms baseline methods, demonstrating exceptional performance and strong generalization capabilities in both short- and long-term predictions within and across domains.

Technology Category

Application Category

📝 Abstract
Learning system dynamics from observations is a critical problem in many applications over various real-world complex systems, e.g., climate, ecology, and fluid systems. Recently, neural dynamics modeling method have become a prevalent solution that embeds the object's observations into a latent space before learning dynamics using neural methods such as neural Ordinary Differential Equations (ODE). Existing dynamics modeling methods induce a specific model for each observation of different complex systems, resulting in poor generalization across systems. Inspired by the great success of pre-trained models, we conduct a generalized Pre-trained Dynamics EncoDER (PDEDER) which can embed the original state observations into a latent space where the dynamics can be captured more easily. To conduct the generalized PDEDER, we pre-train any Pre-trained Language Model (PLM) by minimizing the Lyapunov exponent objective, which constrains the chaotic behavior of governing dynamics learned in the latent space. By penalizing the divergence of embedded observations, our PDEDER promotes locally stable and well-structured latent dynamics, thereby facilitating more effective dynamics modeling than in the original observation space. In addition, we incorporate reconstruction and forecasting objectives to mitigate the risk of obtaining an over-smoothed latent space. Specifically, we collect 152 sets of real-world and synthetic observations from 23 complex systems as pre-training corpora and employ them to pre-train PDEDER. Given any future dynamic observation, we can fine-tune PDEDER with any specific dynamics modeling method. We evaluate PDEDER on 12 dynamic systems by short/long-term forecasting under both in-domain and cross-domain settings, and the empirical results indicate the effectiveness and generalizability of PDEDER.
Problem

Research questions and friction points this paper is trying to address.

dynamics modeling
generalization
complex systems
latent space
neural ODE
Innovation

Methods, ideas, or system contributions that make the work stand out.

pre-trained dynamics modeling
latent space representation
Lyapunov exponent regularization
cross-system generalization
neural ODE
🔎 Similar Papers
No similar papers found.
Yiming Wang
Yiming Wang
School of Chemical Engineering, East China University of Science and Technology
lifelike soft materialsnon-equilibrium materialssupramolecular self-assembly
Z
Zhengnan Zhang
College of Computer Science and Technology, and Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, China
G
Genghe Zhang
College of Computer Science and Technology, and Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, China
J
Jiawen Dan
College of Software and Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, China
Changchun Li
Changchun Li
Jilin University
Text ClassificationTopic ModelingWeakly Supervised LearningPartial Label LearningSemi-supervised Learning
C
Chenlong Hu
College of Software and Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, China
Chris Nugent
Chris Nugent
Ulster University
Ambient Assisted LivingSmart HomesSmart EnvironmentsActivity Recognition
Jun Liu
Jun Liu
Ulster University
Artificial Intelligencedecision sciencelogicrisk assessmentcomputing
Ximing Li
Ximing Li
Jilin university, China; RIKEN AIP, Japan
Weakly-supervised learningMisinformation analysis
B
Bo Yang
College of Computer Science and Technology, and Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, China