🤖 AI Summary
To address the strong data dependency and poor generalization in black-box physical system identification, this paper proposes a physics-informed gradient-based meta-learning framework. It is the first to embed fundamental physical constraints—such as energy conservation and dynamical structure—directly into the meta-learning pipeline to enable rapid adaptation of neural state-space models (NSSMs) to unseen systems. The method employs a modified MAML algorithm combined with subnet fine-tuning, achieving high-fidelity modeling with minimal target-system data (only 1–3 online updates). Experiments on real-world applications—including indoor localization and energy systems—demonstrate substantial improvements: state estimation accuracy increases markedly, and downstream task errors decrease by 32%–47%. This approach overcomes the data bottleneck inherent in conventional single-system modeling, delivering both strong cross-system generalizability and practical engineering deployability.
📝 Abstract
We present a gradient-based meta-learning framework for rapid adaptation of neural state-space models (NSSMs) for black-box system identification. When applicable, we also incorporate domain-specific physical constraints to improve the accuracy of the NSSM. The major benefit of our approach is that instead of relying solely on data from a single target system, our framework utilizes data from a diverse set of source systems, enabling learning from limited target data, as well as with few online training iterations. Through benchmark examples, we demonstrate the potential of our approach, study the effect of fine-tuning subnetworks rather than full fine-tuning, and report real-world case studies to illustrate the practical application and generalizability of the approach to practical problems with physical-constraints. Specifically, we show that the meta-learned models result in improved downstream performance in model-based state estimation in indoor localization and energy systems.