A Partitioned Sparse Variational Gaussian Process for Fast, Distributed Spatial Modeling

📅 2025-07-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
Exascale computing faces a critical I/O bottleneck where storage bandwidth lags far behind computational throughput, hindering post-hoc uncertainty quantification (UQ) for large-scale scientific simulations. To address this, we propose a distributed sparse variational Gaussian process (SVGP) framework tailored for *in situ* statistical analysis. Unlike conventional partitioned modeling—which yields discontinuous response surfaces due to isolated local approximations—our method introduces a lightweight inter-partition communication mechanism that explicitly encodes cross-region spatial dependencies. This preserves node-level parallelism and memory efficiency while ensuring continuity and global consistency of spatial predictions. Evaluated on the Energy Exascale Earth System Model (E3SM), our approach achieves significantly higher predictive accuracy and markedly reduced boundary discontinuities compared to independent per-partition SVGPs, while maintaining strong weak scaling and low communication overhead.

📝 Abstract
The next generation of Department of Energy supercomputers will be capable of exascale computation. For these machines, far more computation will be possible than that which can be saved to disk. As a result, users will be unable to rely on post-hoc access to data for uncertainty quantification and other statistical analyses and there will be an urgent need for sophisticated machine learning algorithms which can be trained in situ. Algorithms deployed in this setting must be highly scalable, memory efficient and capable of handling data which is distributed across nodes as spatially contiguous partitions. One suitable approach involves fitting a sparse variational Gaussian process (SVGP) model independently and in parallel to each spatial partition. The resulting model is scalable, efficient and generally accurate, but produces the undesirable effect of constructing discontinuous response surfaces due to the disagreement between neighboring models at their shared boundary. In this paper, we extend this idea by allowing for a small amount of communication between neighboring spatial partitions which encourages better alignment of the local models, leading to smoother spatial predictions and a better fit in general. Due to our decentralized communication scheme, the proposed extension remains highly scalable and adds very little overhead in terms of computation (and none, in terms of memory). We demonstrate this Partitioned SVGP (PSVGP) approach for the Energy Exascale Earth System Model (E3SM) and compare the results to the independent SVGP case.
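The core idea in the abstract can be illustrated with a toy example. The sketch below is *not* the paper's PSVGP scheme (which uses decentralized communication between distributed SVGP models); it is a minimal 1-D numpy illustration with small exact GPs, showing how two independently fitted local models disagree at their shared boundary, and how exchanging a single consensus value at that boundary (here, a simple average injected as a pseudo-observation, an assumption for illustration) shrinks the discontinuity. All function names and the averaging step are hypothetical.

```python
import numpy as np

def rbf(X1, X2, ls=0.3):
    # Squared-exponential kernel on 1-D inputs.
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_fit_predict(Xtr, ytr, Xte, noise=1e-2):
    # Exact GP regression posterior mean (fine at toy scale;
    # the paper uses sparse variational GPs for scalability).
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    alpha = np.linalg.solve(K, ytr)
    return rbf(Xte, Xtr) @ alpha

# Synthetic smooth field on [0, 2], partitioned at x = 1.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 2.0, 80))
y = np.sin(np.pi * X) + 0.05 * rng.standard_normal(80)

left, right = X < 1.0, X >= 1.0
xb = np.array([1.0])  # shared partition boundary

# Independent local fits: each partition extrapolates to the
# boundary on its own, so their predictions there disagree.
pl = gp_fit_predict(X[left], y[left], xb)
pr = gp_fit_predict(X[right], y[right], xb)
gap_independent = abs(pl[0] - pr[0])

# Communication step (illustrative): neighbors exchange their
# boundary predictions, agree on a consensus value, and refit
# with it as a shared pseudo-observation.
yb = 0.5 * (pl + pr)
pl2 = gp_fit_predict(np.append(X[left], xb), np.append(y[left], yb), xb)
pr2 = gp_fit_predict(np.append(xb, X[right]), np.append(yb, y[right]), xb)
gap_aligned = abs(pl2[0] - pr2[0])

print(f"boundary gap, independent: {gap_independent:.4f}")
print(f"boundary gap, aligned:     {gap_aligned:.4f}")
```

The per-partition refits remain independent once the single boundary value has been exchanged, which mirrors (loosely) why the paper's communication scheme can stay lightweight: alignment needs only a small amount of shared information at partition interfaces, not a joint global fit.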
Problem

Research questions and friction points this paper is trying to address.

Develop scalable spatial modeling for exascale supercomputers
Address discontinuous predictions in partitioned Gaussian processes
Enable efficient inter-partition communication for smoother outputs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Partitioned Sparse Variational Gaussian Process model
Decentralized communication between neighboring partitions
Scalable, memory-efficient in situ training
Michael J. Grosskopf
Statistical Sciences Group, Los Alamos National Laboratory
Kellin Rumsey
Los Alamos National Laboratory
Uncertainty Quantification · Bayesian statistics
Ayan Biswas
Information Sciences Group, Los Alamos National Laboratory
Earl Lawrence
Los Alamos National Laboratory