Data-Augmented Few-Shot Neural Stencil Emulation for System Identification of Computer Models

📅 2025-08-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Training neural partial differential equation (PDE) solvers typically relies on costly long-horizon numerical integration, which produces spatiotemporally redundant trajectory data and yields poor generalization when data are scarce. To address these challenges, this work proposes a few-shot stencil-based data augmentation framework. Instead of sampling full solution trajectories, the method uses space-filling strategies to sample local "stencil" neighborhoods in state space and synthesizes high-information-density training samples from the computational equivalent of only ∼10 time steps of simulation. Crucially, physical structure is embedded directly into the data generation process by explicitly modeling the local differential operator relationships between stencil inputs and outputs. Experiments across multiple canonical PDE systems demonstrate substantial improvements in identification accuracy and out-of-distribution generalization in data-limited regimes, outperforming conventional trajectory-sampling baselines while achieving significantly higher training efficiency.

📝 Abstract
Partial differential equations (PDEs) underpin the modeling of many natural and engineered systems. It can be convenient to express such models as neural PDEs rather than using traditional numerical PDE solvers by replacing part or all of the PDE's governing equations with a neural network representation. Neural PDEs are often easier to differentiate, linearize, reduce, or use for uncertainty quantification than the original numerical solver. They are usually trained on solution trajectories obtained by long time integration of the PDE solver. Here we propose a more sample-efficient data-augmentation strategy for generating neural PDE training data from a computer model by space-filling sampling of local "stencil" states. This approach removes a large degree of spatiotemporal redundancy present in trajectory data and oversamples states that may be rarely visited but help the neural PDE generalize across the state space. We demonstrate that accurate neural PDE stencil operators can be learned from synthetic training data generated by the computational equivalent of 10 timesteps' worth of numerical simulation. Accuracy is further improved if we assume access to a single full-trajectory simulation from the computer model, which is typically available in practice. Across several PDE systems, we show that our data-augmented synthetic stencil data yield better trained neural stencil operators, with clear performance gains compared with naively sampled stencil data from simulation trajectories.
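The abstract's core idea, space-filling sampling of local stencil states labeled by the computer model's local update rule, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses a Latin-hypercube-style stratified sampler and an explicit finite-difference step of the 1D heat equation as a hypothetical stand-in for the computer model; all function names and parameters are illustrative.

```python
import numpy as np

def heat_stencil_update(s, dt=1e-4, dx=0.1):
    """One explicit finite-difference step of the 1D heat equation,
    applied at the centre of a 3-point stencil. This plays the role of
    the computer model's local update rule (an assumption for the sketch)."""
    left, centre, right = s[..., 0], s[..., 1], s[..., 2]
    return centre + dt / dx**2 * (left - 2.0 * centre + right)

def sample_stencil_dataset(n, lo=-1.0, hi=1.0, seed=0):
    """Space-filling (Latin-hypercube-style) sampling of 3-point stencil
    states over [lo, hi)^3, labelled by the local update rule. Unlike
    trajectory sampling, this covers the state space directly, including
    states a simulation would rarely visit."""
    rng = np.random.default_rng(seed)
    # One stratified permutation per stencil slot: each of the n strata
    # in each coordinate is hit exactly once.
    u = (rng.permuted(np.tile(np.arange(n), (3, 1)), axis=1).T
         + rng.random((n, 3))) / n
    X = lo + (hi - lo) * u          # stencil states, shape (n, 3)
    y = heat_stencil_update(X)       # next centre values, shape (n,)
    return X, y
```

Because each label requires only one local evaluation of the update rule rather than a full trajectory, a dataset of thousands of stencils costs roughly as much as a handful of simulation timesteps over a grid.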
Problem

Research questions and friction points this paper is trying to address.

Efficiently generating neural PDE training data
Reducing spatiotemporal redundancy in trajectory data
Improving neural stencil operator generalization accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Space-filling sampling of local stencil states
Synthetic training data from ~10 timesteps of simulation
Augmented stencil data from a single full-trajectory simulation
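Once a stencil operator has been fit to such samples, it is applied pointwise across the grid and iterated in time, which is the standard neural-PDE rollout pattern. The sketch below, under the same illustrative assumptions as the heat-equation example (a linear least-squares fit stands in for a trained neural network; names are hypothetical), shows that pattern on a periodic 1D grid:

```python
import numpy as np

def rollout(stencil_op, u0, steps):
    """Advance a periodic 1D field by repeatedly applying a learned
    local operator to every 3-point neighbourhood of the grid."""
    u = u0.copy()
    for _ in range(steps):
        X = np.stack([np.roll(u, 1), u, np.roll(u, -1)], axis=-1)
        u = stencil_op(X)
    return u

# Fit a linear stencil operator by least squares to synthetic stencil
# samples -- a stand-in for training a neural network on the same data.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(2000, 3))
dt, dx = 1e-4, 0.1
y = X[:, 1] + dt / dx**2 * (X[:, 0] - 2.0 * X[:, 1] + X[:, 2])
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Roll the learned operator out from a smooth initial condition.
u0 = np.sin(2.0 * np.pi * np.arange(64) / 64)
u_pred = rollout(lambda s: s @ w, u0, steps=10)
```

Since the target here is exactly linear in the stencil, the fitted weights recover the finite-difference coefficients and the rollout matches the reference simulation; with a nonlinear update rule, `stencil_op` would be a neural network trained on the same stencil samples.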