Graph Random Features for Scalable Gaussian Processes

📅 2025-09-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Scalable Gaussian process (GP) Bayesian optimization on large graphs is blocked by the $O(N^3)$ cost of exact kernel methods, which is infeasible for million-node graphs. The paper proposes an efficient approximation framework based on Graph Random Features (GRFs), stochastic estimators that construct scalable random feature maps over discrete graph node spaces. This turns GP inference into a low-dimensional linear model, reducing time complexity to $O(N^{3/2})$ under mild assumptions and yielding substantial memory savings. The method enables end-to-end Bayesian optimization on graphs with over $10^6$ nodes on a single computer chip, achieving large wall-clock speedups while preserving prediction accuracy and optimization performance competitive with exact baselines. The key contribution is a systematic integration of GRFs into GP approximation for discrete, structured inputs, establishing a scalable, theoretically grounded, and practically viable paradigm for large-scale graph-based Bayesian modeling.
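The computational trick at the heart of this line of work, replacing the exact kernel with an inner product of random features so that GP inference collapses to a low-dimensional linear model, can be illustrated with a generic sketch. The Gaussian feature map below is a placeholder, not the paper's GRF construction; the point is the push-through identity that swaps an $N \times N$ solve for a $d \times d$ one:

```python
import numpy as np

def rf_gp_posterior_mean(Phi, y, noise=1e-2):
    """GP posterior mean at the training points, using an N x d feature map Phi.

    With the kernel approximated as K = Phi @ Phi.T, the push-through identity
        K (K + s I_N)^{-1} y  ==  Phi (Phi.T Phi + s I_d)^{-1} Phi.T y
    lets us solve a d x d linear system instead of an N x N one:
    O(N d^2 + d^3) time rather than O(N^3).
    """
    d = Phi.shape[1]
    A = Phi.T @ Phi + noise * np.eye(d)          # d x d system matrix
    return Phi @ np.linalg.solve(A, Phi.T @ y)

rng = np.random.default_rng(0)
N, d = 500, 20
Phi = rng.normal(size=(N, d)) / np.sqrt(d)       # placeholder random feature map
y = rng.normal(size=N)

# Exact GP mean with the induced kernel K = Phi Phi^T, for comparison.
K = Phi @ Phi.T
exact = K @ np.linalg.solve(K + 1e-2 * np.eye(N), y)
approx = rf_gp_posterior_mean(Phi, y, noise=1e-2)
print(np.max(np.abs(exact - approx)))            # agrees up to numerical error
```

The two computations are mathematically identical; only the cost differs, which is why a good low-dimensional feature map for the graph kernel is the whole game.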

📝 Abstract
We study the application of graph random features (GRFs) - a recently introduced stochastic estimator of graph node kernels - to scalable Gaussian processes on discrete input spaces. We prove that (under mild assumptions) Bayesian inference with GRFs enjoys $O(N^{3/2})$ time complexity with respect to the number of nodes $N$, compared to $O(N^3)$ for exact kernels. Substantial wall-clock speedups and memory savings unlock Bayesian optimisation on graphs with over $10^6$ nodes on a single computer chip, whilst preserving competitive performance.
Problem

Research questions and friction points this paper is trying to address.

Scalable Gaussian processes on discrete input spaces
Reducing time complexity from $O(N^3)$ to $O(N^{3/2})$
Enabling Bayesian optimization on million-node graphs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph random features for scalable Gaussian processes
Achieves $O(N^{3/2})$ time complexity for inference
Enables Bayesian optimization on million-node graphs
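As a rough illustration of the random-feature idea on graphs, here is a toy walk-based estimator of a power-series graph kernel. It is a hypothetical sketch in the spirit of GRFs, not the exact construction or kernel family from the paper: walks from node $i$ deposit importance-weighted loads at visited nodes, giving features whose inner products are unbiased for $(I - \lambda A)^{-2}$ on a symmetric adjacency matrix $A$ (up to a small diagonal bias from reusing walks, ignored here):

```python
import numpy as np

def toy_graph_random_features(adj, lam=0.15, p_halt=0.5, n_walks=3000, seed=0):
    """Toy walk-based random features on an unweighted graph (adjacency lists).

    A walk prefix i -> ... -> v of length t deposits
    lam^t * prod(deg(v_j) / (1 - p_halt)) at coordinate v. Dividing by the
    prefix probability this way makes E[phi_i[v]] = sum_t lam^t (A^t)_{iv},
    so E[Phi @ Phi.T] ~= (I - lam A)^{-2} entrywise for symmetric A.
    """
    rng = np.random.default_rng(seed)
    N = len(adj)
    Phi = np.zeros((N, N))
    for i in range(N):
        for _ in range(n_walks):
            v, weight = i, 1.0
            Phi[i, v] += weight                  # length-0 prefix
            while rng.random() > p_halt:         # continue with prob 1 - p_halt
                weight *= lam * len(adj[v]) / (1.0 - p_halt)
                v = adj[v][rng.integers(len(adj[v]))]
                Phi[i, v] += weight
    return Phi / n_walks

# 4-cycle: node i connects to (i - 1) % 4 and (i + 1) % 4.
adj = [[1, 3], [0, 2], [1, 3], [0, 2]]
Phi = toy_graph_random_features(adj)
K_est = Phi @ Phi.T                              # estimated kernel matrix

A = np.zeros((4, 4))
for i, nbrs in enumerate(adj):
    A[i, nbrs] = 1.0
M = np.linalg.inv(np.eye(4) - 0.15 * A)
K_exact = M @ M                                  # (I - lam A)^{-2}
print(np.max(np.abs(K_est - K_exact)))           # small Monte Carlo error
```

Because the walk lengths are geometrically distributed and independent of $N$, the features stay cheap per node; the paper's contribution is proving that a principled version of this trade-off gives $O(N^{3/2})$ end-to-end GP inference.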