🤖 AI Summary
Exponential growth in scientific data has outpaced network bandwidth, storage capacity, and analytical capabilities, necessitating lossy compression. However, existing research lacks application-specific fidelity requirements aligned with scientific discovery goals, leading to a disconnect between algorithm design and real-world needs.
Method: We conduct a systematic survey of nine representative scientific domains—including climate modeling, combustion, and cosmology—to identify cross-cutting quality constraints (e.g., error bounds on key physical quantities), compression ratios, and throughput requirements. We then develop a “discovery-oriented” compression evaluation framework and analyze error-control mechanisms and applicability boundaries of mainstream tools (SZ, ZFP, MGARD).
Contribution/Results: This work delivers the first multi-disciplinary white paper on lossy compression requirements for scientific data, enabling verifiable benchmarking and co-evolution of high-fidelity, high-performance compression toolchains.
📝 Abstract
Increasing data volumes from scientific simulations and instruments (supercomputers, accelerators, telescopes) often exceed network, storage, and analysis capabilities. The scientific community's response to this challenge is scientific data reduction. Reduction can take many forms, such as triggering, sampling, filtering, quantization, and dimensionality reduction. This report focuses on a specific technique: lossy compression. Lossy compression retains all data points, exploiting correlations in the data and a controlled reduction in accuracy. Quality constraints, especially on quantities of interest, are crucial for preserving scientific discoveries. User requirements also include compression ratio and speed. While many papers have been published on lossy compression techniques and reference datasets are shared by the community, there is a lack of detailed specifications of application needs that could guide lossy compression researchers and developers. This report fills that gap by documenting the requirements and constraints of nine scientific applications covering a large spectrum of domains (climate, combustion, cosmology, fusion, light sources, molecular dynamics, quantum circuit simulation, seismology, and system logs). The report also details key lossy compression technologies (SZ, ZFP, MGARD, LC, SPERR, DCTZ, TEZip, LibPressio), discussing their history, principles, error control, hardware support, features, and impact. By presenting both application needs and compression technologies, the report aims to inspire new research that fills the remaining gaps.
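To make the notion of "controlled reduced accuracy" concrete, here is a minimal sketch of the predict-and-quantize idea that underlies error-bounded compressors such as SZ. The function names, the simple previous-value predictor, and the omission of the entropy-coding stage are illustrative assumptions for this sketch, not the actual implementation of any of the tools discussed; the point is only that every reconstructed value is guaranteed to lie within a user-chosen absolute error bound of the original.

```python
import numpy as np

def quantize_with_error_bound(data, eps):
    """Sketch of SZ-style error-bounded predictive quantization.

    Each value is predicted from the previous *reconstructed* value
    (a stand-in for SZ's richer predictors), and the residual is
    quantized to an integer bin of width 2*eps. Rounding to the
    nearest bin guarantees |original - reconstructed| <= eps, and
    predicting from reconstructed values prevents error accumulation.
    The integer codes would normally be entropy-coded (omitted here).
    """
    codes = np.empty(len(data), dtype=np.int64)   # compressible integer stream
    recon = np.empty(len(data), dtype=float)      # decoder-side reconstruction
    prev = 0.0
    for i, x in enumerate(data):
        residual = x - prev                       # prediction error
        q = int(round(residual / (2.0 * eps)))    # quantization bin index
        codes[i] = q
        prev = prev + q * 2.0 * eps               # value the decoder will see
        recon[i] = prev
    return codes, recon
```

A quick check on a smooth signal shows why this compresses well: consecutive codes are small integers clustered around zero, which entropy coders shrink dramatically, while the maximum pointwise error never exceeds `eps`.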