Meta-learning Structure-Preserving Dynamics

📅 2025-08-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing structure-preserving dynamical models are trained for fixed parameter configurations, requiring explicit physical priors and per-parameter retraining, which makes them ill-suited for many-query or parameter-varying scenarios. To address this, the paper proposes a modulation-based meta-learning framework that directly conditions structure-preserving models on compact latent representations of potentially unknown system parameters, enabling rapid adaptation without explicit optimization or gray-box knowledge. The work introduces computer vision-inspired modulation mechanisms into energy-conserving and dissipative system modeling, integrated with geometric numerical integration to enforce physical constraints. On standard benchmarks, the approach improves few-shot prediction accuracy while maintaining dynamical stability and generalization across parameter space, addressing limitations of prior meta-learning methods such as training instability, poor generalization, and weak scalability.

📝 Abstract
Structure-preserving approaches to dynamics modeling have demonstrated great potential for modeling physical systems due to their strong inductive biases that enforce conservation laws and dissipative behavior. However, the resulting models are typically trained for fixed system configurations, requiring explicit knowledge of system parameters as well as costly retraining for each new set of parameters -- a major limitation in many-query or parameter-varying scenarios. Meta-learning offers a potential solution, but existing approaches like optimization-based meta-learning often suffer from training instability or limited generalization capability. Inspired by ideas from computer vision, we introduce a modulation-based meta-learning framework that directly conditions structure-preserving models on compact latent representations of potentially unknown system parameters, avoiding the need for gray-box system knowledge and explicit optimization during adaptation. Through the application of novel modulation strategies to parametric energy-conserving and dissipative systems, we enable scalable and generalizable learning across parametric families of dynamical systems. Experiments on standard benchmark problems demonstrate that our approach achieves accurate predictions in few-shot learning settings, without compromising on the essential physical constraints necessary for dynamical stability and effective generalization performance across parameter space.
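The abstract's central idea, conditioning a structure-preserving model on a compact latent code and rolling it out with a geometric integrator, can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes a FiLM-style (feature-wise scale-and-shift) modulation layer as the "computer vision-inspired" mechanism, a tiny randomly initialized Hamiltonian network in place of trained weights, finite-difference gradients instead of autodiff, and a symplectic Euler integrator; all dimensions and weight values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 1-D phase space (q, p), 8-dim latent code z.
DIM, LATENT, HIDDEN = 1, 8, 16

# Randomly initialized weights stand in for trained parameters.
W1 = rng.normal(size=(HIDDEN, 2 * DIM)) * 0.1
W2 = rng.normal(size=(1, HIDDEN)) * 0.1
W_gamma = rng.normal(size=(HIDDEN, LATENT)) * 0.1
W_beta = rng.normal(size=(HIDDEN, LATENT)) * 0.1

def hamiltonian(q, p, z):
    """Scalar energy H(q, p; z), modulated by the latent code z."""
    h = np.tanh(W1 @ np.concatenate([q, p]))
    gamma, beta = W_gamma @ z, W_beta @ z      # per-feature scale and shift
    h = (1.0 + gamma) * h + beta               # FiLM-style modulation layer
    return float(W2 @ h)

def grad_H(q, p, z, eps=1e-5):
    """Central-difference dH/dq and dH/dp (autodiff in a real model)."""
    dq = np.array([(hamiltonian(q + eps * e, p, z)
                    - hamiltonian(q - eps * e, p, z)) / (2 * eps)
                   for e in np.eye(DIM)])
    dp = np.array([(hamiltonian(q, p + eps * e, z)
                    - hamiltonian(q, p - eps * e, z)) / (2 * eps)
                   for e in np.eye(DIM)])
    return dq, dp

def symplectic_euler(q, p, z, dt=0.01, steps=100):
    """Roll out Hamilton's equations with a structure-preserving scheme."""
    traj = [(q.copy(), p.copy())]
    for _ in range(steps):
        dq, _ = grad_H(q, p, z)
        p = p - dt * dq                # momentum update at current q
        _, dp = grad_H(q, p, z)
        q = q + dt * dp                # position update at updated p
        traj.append((q.copy(), p.copy()))
    return traj

z = rng.normal(size=LATENT)            # latent code for one system instance
traj = symplectic_euler(np.ones(DIM), np.zeros(DIM), z)
```

Because the integrator is symplectic, the modulated energy H(q, p; z) shows bounded drift over the rollout rather than secular growth, and swapping in a different latent z re-targets the same network to another member of the parametric family without retraining.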
Problem

Research questions and friction points this paper is trying to address.

Modeling dynamics for varying system parameters without retraining
Overcoming instability in meta-learning for physical systems
Ensuring structure preservation in scalable dynamical system learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Modulation-based meta-learning for dynamics modeling
Latent representations for unknown system parameters
Novel modulation strategies for scalable learning
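The second innovation listed above, inferring a latent representation of unknown system parameters without gradient-based adaptation, could be realized with an amortized context encoder. The sketch below is an assumption rather than the paper's architecture: it uses DeepSets-style mean pooling over a handful of observed transitions so that few-shot adaptation reduces to a single forward pass, and all weights and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical encoder: maps K few-shot (state, next_state) pairs to a
# latent code z by mean-pooling per-pair features, so adaptation needs
# no inner-loop optimization -- just one forward pass.
OBS_DIM, FEAT, LATENT = 2, 16, 8
W_enc = rng.normal(size=(FEAT, 2 * OBS_DIM)) * 0.1
W_out = rng.normal(size=(LATENT, FEAT)) * 0.1

def encode_context(pairs):
    """pairs: array of shape (K, 2, OBS_DIM) -- K observed transitions."""
    feats = np.tanh(pairs.reshape(len(pairs), -1) @ W_enc.T)   # (K, FEAT)
    return W_out @ feats.mean(axis=0)                          # (LATENT,)

# Five observed transitions from an unseen system configuration.
context = rng.normal(size=(5, 2, OBS_DIM))
z = encode_context(context)
```

Mean pooling makes the encoding permutation-invariant in the observed transitions, which is the natural symmetry for an unordered few-shot context set; the resulting z would then feed the modulation layers of the structure-preserving model.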
Cheng Jing
School of Computing and Augmented Intelligence, Arizona State University
Uvini Balasuriya Mudiyanselage
School of Computing and Augmented Intelligence, Arizona State University
Woojin Cho
TelePIX Co., Ltd.
Minju Jo
Yonsei Univ.
INR, SciML, Time-series
Anthony Gruber
Senior Member of Technical Staff, Sandia National Laboratories
differential geometry, model reduction, scientific machine learning, geometric mechanics
Kookjin Lee
Arizona State University