🤖 AI Summary
This work addresses the challenge of exascale spatiotemporal data, ranging from petabytes to exabytes, generated by simulations of high-dimensional partial differential equations, which severely strains storage capacity in high-performance computing. To mitigate this bottleneck, the authors propose an end-to-end in situ spatiotemporal compression framework that integrates adaptive temporal snapshot selection with neural implicit representations for spatial compression in a single streaming pass. By continually fine-tuning a neural field to learn residuals between adjacent snapshots, the method compresses the temporal and spatial components jointly while preserving physical fidelity. This approach reduces storage overhead by several orders of magnitude and substantially reduces reliance on disk-based storage systems.
📝 Abstract
The persistent storage requirements of high-resolution, spatiotemporally evolving fields governed by large-scale, high-dimensional partial differential equations (PDEs) have reached the petabyte-to-exabyte scale. Transient simulations of the Navier-Stokes equations, magnetohydrodynamics, plasma physics, or binary black hole mergers generate data volumes that are prohibitive for modern high-performance computing (HPC) infrastructures. To address this bottleneck, we introduce ANTIC (Adaptive Neural Temporal in situ Compressor), an end-to-end in situ compression pipeline. ANTIC combines an adaptive temporal selector, tailored to high-dimensional physics, that identifies and filters informative snapshots at simulation time with a spatial neural compression module that continually fine-tunes a neural field to learn residual updates between adjacent snapshots. By operating in a single streaming pass, ANTIC compresses the temporal and spatial components jointly and alleviates the need to store entire time-evolved trajectories explicitly on disk. Experimental results demonstrate storage reductions of several orders of magnitude and quantify their trade-off against physical accuracy.
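The streaming pipeline sketched in the abstract (select informative snapshots, then fine-tune a neural field on the residual) can be illustrated in a few lines. The snippet below is a minimal sketch, not the authors' implementation: a fixed Fourier-feature encoding with a least-squares weight update stands in for the neural field and its gradient-based fine-tuning, a relative-error threshold stands in for the adaptive temporal selector, and all names (`fourier_features`, `fit_residual`, `threshold`) and the synthetic drifting-wave data are illustrative assumptions.

```python
import numpy as np

def fourier_features(x, n_freq=8):
    """Fixed positional encoding of 1-D coordinates (stand-in for a coordinate MLP)."""
    freqs = 2.0 * np.pi * (2.0 ** np.arange(n_freq))
    ang = np.outer(x, freqs)                        # shape (N, n_freq)
    return np.concatenate([np.sin(ang), np.cos(ang), np.ones((x.size, 1))], axis=1)

def fit_residual(features, residual):
    """Least-squares weight update (stand-in for gradient-based fine-tuning)."""
    dw, *_ = np.linalg.lstsq(features, residual, rcond=None)
    return dw

x = np.linspace(0.0, 1.0, 256)                      # spatial grid
phi = fourier_features(x)

kept = []                                           # time indices the selector keeps
w = np.zeros(phi.shape[1])                          # field weights, updated in-stream
threshold = 0.04                                    # relative-error trigger (illustrative)

for t in range(50):                                 # one streaming pass over snapshots
    u = np.sin(2.0 * np.pi * (x - 0.002 * t))       # synthetic slowly drifting field
    err = np.linalg.norm(phi @ w - u) / np.linalg.norm(u)
    if err < threshold:
        continue                                    # field still faithful: drop snapshot
    w = w + fit_residual(phi, u - phi @ w)          # fine-tune on the residual only
    kept.append(t)

print(f"kept {len(kept)} of 50 snapshots; storage holds only w ({w.size} floats)")
```

Because the field is updated only on its residual against the newest kept snapshot, the stored artifact is the compact weight vector plus a short list of kept time indices, rather than the full trajectory; the selector's threshold is the knob that trades storage against reconstruction fidelity.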