Hierarchical Implicit Neural Emulators

📅 2025-06-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
Neural PDE solvers suffer from error accumulation, poor long-term stability, and physical inconsistency in extended-time forecasting. To address these limitations, this work proposes the Multi-scale Implicit Neural Simulator (MINS). MINS introduces a hierarchical implicit conditioning mechanism that dynamically adjusts temporal downsampling rates to jointly model multi-granularity dynamics: it guides fine-grained one-step predictions with hierarchically compressed future-state representations, and builds the stability of implicit time-stepping, previously unexplored in neural PDE architectures, directly into the model design. The framework integrates implicit neural networks, multi-scale temporal compression, hierarchical conditional prediction, and physics-guided self-supervised training. On turbulent flow simulation, MINS improves short-term accuracy by 18%, reduces hundred-step prediction error by 3.2×, preserves physically stable long-term trajectories, and adds less than 5% computational overhead.
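The hierarchical conditioning idea can be illustrated with a toy sketch: coarse heads forecast compressed versions of the state at increasing compression rates, and the fine head conditions its one-step prediction on those contexts. Everything here (the `downsample` average-pooling helper, the linear coarse and fine heads, the dimensions and rates) is hypothetical and stands in for networks the summary does not specify.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16  # toy 1-D state dimension

def downsample(x, rate):
    # average-pool the state by `rate` — a stand-in for the paper's compression
    return x.reshape(-1, rate).mean(axis=1)

# hypothetical coarse heads: map a compressed state to a compressed forecast
coarse_heads = {r: rng.standard_normal((N // r, N // r)) * 0.1 for r in (2, 4)}
# hypothetical fine head: one-step prediction conditioned on state + coarse contexts
ctx_dim = N // 2 + N // 4
fine_head = rng.standard_normal((N, N + ctx_dim)) * 0.1

def step(state):
    # hierarchical implicit conditioning: concatenate coarse future-state
    # contexts onto the current state before the fine one-step prediction
    contexts = [coarse_heads[r] @ downsample(state, r) for r in (2, 4)]
    return fine_head @ np.concatenate([state] + contexts)

state = rng.standard_normal(N)
traj = [state]
for _ in range(100):  # hundred-step autoregressive rollout
    traj.append(step(traj[-1]))
traj = np.stack(traj)
print(traj.shape)  # (101, 16)
```

The point of the sketch is only the data flow: each fine step sees hierarchically compressed look-ahead information rather than the raw state alone.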

📝 Abstract
Neural PDE solvers offer a powerful tool for modeling complex dynamical systems, but often struggle with error accumulation over long time horizons and maintaining stability and physical consistency. We introduce a multiscale implicit neural emulator that enhances long-term prediction accuracy by conditioning on a hierarchy of lower-dimensional future state representations. Drawing inspiration from the stability properties of numerical implicit time-stepping methods, our approach leverages predictions several steps ahead in time at increasing compression rates for next-timestep refinements. By actively adjusting the temporal downsampling ratios, our design enables the model to capture dynamics across multiple granularities and enforce long-range temporal coherence. Experiments on turbulent fluid dynamics show that our method achieves high short-term accuracy and produces long-term stable forecasts, significantly outperforming autoregressive baselines while adding minimal computational overhead.
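The "stability properties of numerical implicit time-stepping methods" the abstract draws on are a textbook phenomenon, separate from anything MINS-specific: for a stiff linear ODE dy/dt = λy with a large step size, explicit Euler diverges while backward (implicit) Euler decays. A minimal illustration, standard numerical analysis rather than code from the paper:

```python
# Stiff test equation dy/dt = lam * y with a step size far beyond
# explicit Euler's stability limit (|1 + dt*lam| must be <= 1).
lam, dt, steps = -50.0, 0.1, 20
y_exp = y_imp = 1.0
for _ in range(steps):
    y_exp = y_exp + dt * lam * y_exp   # explicit: y_{n+1} = (1 + dt*lam) * y_n
    y_imp = y_imp / (1.0 - dt * lam)   # implicit: y_{n+1} = y_n / (1 - dt*lam)

# |y_exp| grows like 4**20 while |y_imp| decays like 6**-20
print(abs(y_exp), abs(y_imp))
```

This unconditional decay of the implicit update is the behavior the emulator tries to inherit by conditioning on compressed future states.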
Problem

Research questions and friction points this paper is trying to address.

Error accumulation over long time horizons in neural PDE solvers
Maintaining stability and physical consistency in dynamical-system forecasts
Enforcing temporal coherence across multiple granularities
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multiscale implicit neural emulator for long-horizon prediction
Conditions next-step refinement on multi-step-ahead predictions at increasing compression rates
Actively adjusts temporal downsampling ratios to capture multi-granularity dynamics