🤖 AI Summary
Existing structure-preserving dynamical models rely on fixed parameter configurations, requiring explicit physical priors and per-parameter retraining—rendering them ill-suited for multi-query or parameter-varying scenarios. To address this, we propose a modulation-based meta-learning framework that directly maps structure-preserving models to compact latent representations of unknown system parameters, enabling rapid adaptation without explicit optimization or gray-box knowledge. This work introduces, for the first time, computer vision–inspired modulation mechanisms into energy-conserving and dissipative system modeling, integrated with geometric numerical integration to rigorously enforce physical constraints. Our approach achieves significant improvements in few-shot prediction accuracy on standard benchmarks while guaranteeing dynamical stability and cross-parameter-space generalization. It overcomes critical limitations of prior methods—including training instability, poor generalization, and weak scalability—establishing a robust, adaptive paradigm for physics-informed learning.
📝 Abstract
Structure-preserving approaches to dynamics modeling have demonstrated great potential for modeling physical systems due to their strong inductive biases that enforce conservation laws and dissipative behavior. However, the resulting models are typically trained for fixed system configurations, requiring explicit knowledge of system parameters as well as costly retraining for each new set of parameters -- a major limitation in many-query or parameter-varying scenarios. Meta-learning offers a potential solution, but existing approaches like optimization-based meta-learning often suffer from training instability or limited generalization capability. Inspired by ideas from computer vision, we introduce a modulation-based meta-learning framework that directly conditions structure-preserving models on compact latent representations of potentially unknown system parameters, avoiding the need for gray-box system knowledge and explicit optimization during adaptation. Through the application of novel modulation strategies to parametric energy-conserving and dissipative systems, we enable scalable and generalizable learning across parametric families of dynamical systems. Experiments on standard benchmark problems demonstrate that our approach achieves accurate predictions in few-shot learning settings, without compromising on the essential physical constraints necessary for dynamical stability and effective generalization performance across parameter space.