Scaling Gaussian Process Regression with Full Derivative Observations

📅 2025-05-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the poor scalability of Gaussian process (GP) regression with full derivative observations on high-dimensional, large-scale data, this paper proposes DSoftKI. Methodologically, DSoftKI extends the softmax-based interpolation framework of SoftKI to settings with derivative observations, introducing a direction-aware soft interpolation mechanism that yields differentiable, scalable approximations of the kernel and its first- and second-order derivatives. By jointly learning interpolation point locations and a derivative-aware kernel approximation, DSoftKI enables efficient modeling. Empirically, on synthetic function benchmarks and molecular force field prediction tasks spanning hundreds to thousands of dimensions, DSoftKI improves both accuracy and computational efficiency for derivative-augmented GP regression, overcoming longstanding scalability bottlenecks of conventional GPs in high-dimensional, large-scale, derivative-observed regimes.
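The core idea of softmax-based kernel interpolation can be sketched as follows: interpolation weights from each data point to a small set of learned interpolation points are computed with a softmax, and the full kernel matrix is approximated through those weights. This is a minimal numpy sketch under assumed details (a distance-based softmax with temperature `tau`, an RBF base kernel); the paper's exact SoftKI/DSoftKI formulation may differ.

```python
import numpy as np

def softmax_weights(X, Z, tau=1.0):
    """Softmax interpolation weights from data X (n, d) to interpolation
    points Z (m, d); each row sums to 1.  Hypothetical distance-based
    form, not necessarily the paper's exact scheme."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)  # (n, m) squared distances
    logits = -d2 / tau
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    w = np.exp(logits)
    return w / w.sum(axis=1, keepdims=True)

def rbf_kernel(A, B, lengthscale=1.0):
    """Standard RBF kernel between row sets A (n, d) and B (m, d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

# Structured approximation K(X, X) ~= W K_ZZ W^T: the n x n kernel is
# never formed exactly, only an m x m kernel over interpolation points.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))            # n = 200 points in 5 dimensions
Z = rng.normal(size=(20, 5))             # m = 20 (learned) interpolation points
W = softmax_weights(X, Z)                # (200, 20)
K_approx = W @ rbf_kernel(Z, Z) @ W.T    # rank-<=m approximation of K(X, X)
```

Because `W` is a differentiable function of both the data and the interpolation point locations, the locations `Z` (and `tau`) can be optimized by gradient descent alongside the kernel hyperparameters, which is the kind of joint learning the summary describes.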

📝 Abstract
We present a scalable Gaussian Process (GP) method that can fit and predict full derivative observations called DSoftKI. It extends SoftKI, a method that approximates a kernel via softmax interpolation from learned interpolation point locations, to the setting with derivatives. DSoftKI enhances SoftKI's interpolation scheme to incorporate the directional orientation of interpolation points relative to the data. This enables the construction of a scalable approximate kernel, including its first and second-order derivatives, through interpolation. We evaluate DSoftKI on a synthetic function benchmark and high-dimensional molecular force field prediction (100-1000 dimensions), demonstrating that DSoftKI is accurate and can scale to larger datasets with full derivative observations than previously possible.
Problem

Research questions and friction points this paper is trying to address.

Scaling Gaussian Process Regression with derivative observations
Extending SoftKI to handle derivative data effectively
Enabling scalable kernel approximation for high-dimensional datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends SoftKI to handle full derivative observations
Enhances interpolation with directional point orientation
Scalable kernel approximation for high-dimensional data
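Since the softmax-interpolated kernel is differentiable in its inputs, cross-covariances involving derivative observations can be obtained by differentiating the interpolation weights themselves. The sketch below illustrates this for first derivatives only, using a hypothetical distance-based softmax and an RBF base kernel (assumed details, not the paper's exact construction): the gradient of the approximate kernel in its first argument is `(dw/dx)^T K_ZZ w(x')`.

```python
import numpy as np

def softmax_weights(x, Z, tau=1.0):
    """Softmax interpolation weights of one point x (d,) against
    interpolation points Z (m, d).  Hypothetical distance-based form."""
    logits = -((x[None, :] - Z) ** 2).sum(-1) / tau
    logits -= logits.max()                       # numerical stability
    w = np.exp(logits)
    return w / w.sum()

def weight_grad(x, Z, tau=1.0):
    """Analytic Jacobian dw/dx, shape (m, d), via the softmax chain rule:
    dw_j/dx = w_j * (dl_j/dx - sum_k w_k dl_k/dx)."""
    w = softmax_weights(x, Z, tau)               # (m,)
    dlogits = -2.0 * (x[None, :] - Z) / tau      # dl_j/dx, (m, d)
    mean = (w[:, None] * dlogits).sum(axis=0)    # softmax normalization term
    return w[:, None] * (dlogits - mean[None, :])

def rbf(A, B, ell=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

rng = np.random.default_rng(1)
Z = rng.normal(size=(10, 3))                     # m = 10 interpolation points, d = 3
x, xp = rng.normal(size=3), rng.normal(size=3)
K_ZZ = rbf(Z, Z)

# Cross-covariance between df/dx at x and f at x':
# d/dx k~(x, x') = (dw/dx)^T K_ZZ w(x'), shape (d,)
grad_k = weight_grad(x, Z).T @ K_ZZ @ softmax_weights(xp, Z)
```

Repeating the same chain rule once more gives second-order kernel derivatives, so the entire derivative-augmented covariance is assembled from one small `K_ZZ` block plus weight derivatives, which is what makes the approximation scale.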