Random Wavelet Features for Graph Kernel Machines

πŸ“… 2026-02-17
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
This work addresses the challenge of efficiently computing graph kernels for measuring node similarity in large-scale graphs by proposing a spectral method based on random wavelet features. It is the first to bring random features to graph kernel approximation, with a construction tailored to spectrally localized kernels. The approach builds low-dimensional node embeddings whose inner products approximate a low-rank form of any specified graph kernel, achieving significantly improved approximation accuracy while retaining theoretical guarantees. Experiments show that the resulting embeddings outperform existing methods across multiple graph learning tasks, combining scalability with expressive power and offering an efficient, theoretically grounded route to large-scale graph representation learning.

πŸ“ Abstract
Node embeddings map graph vertices into low-dimensional Euclidean spaces while preserving structural information. They are central to tasks such as node classification, link prediction, and signal reconstruction. A key goal is to design node embeddings whose dot products capture meaningful notions of node similarity induced by the graph. Graph kernels offer a principled way to define such similarities, but their direct computation is often prohibitive for large networks. Inspired by random feature methods for kernel approximation in Euclidean spaces, we introduce randomized spectral node embeddings whose dot products estimate a low-rank approximation of any specified graph kernel. We provide theoretical and empirical results showing that our embeddings achieve more accurate kernel approximations than existing methods, particularly for spectrally localized kernels. These results demonstrate the effectiveness of randomized spectral constructions for scalable and principled graph representation learning.
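The abstract's core idea — node embeddings whose dot products estimate a spectral graph kernel — can be illustrated with a generic randomized sketch (this is not the paper's wavelet construction): for a kernel K = h(L) defined by a filter h on the Laplacian spectrum, the rows of Z = h(L)^{1/2} G / √D with Gaussian G satisfy E[Z Zᵀ] = K. A minimal NumPy sketch, where the ring graph, the heat-kernel filter, and the embedding dimension D are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Small ring graph (assumption): adjacency A, Laplacian L = deg(A) - A.
n = 20
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Spectral kernel K = U h(Lambda) U^T; h is a heat-kernel filter (assumption),
# a simple example of a spectrally localized kernel.
lam, U = np.linalg.eigh(L)
h = np.exp(-0.5 * lam)
K = (U * h) @ U.T                         # exact kernel: O(n^3), infeasible at scale

# Randomized embedding: Z = h(L)^{1/2} G / sqrt(D) with G ~ N(0, 1).
# Then E[Z Z^T] = h(L)^{1/2} (G G^T / D) h(L)^{1/2} -> K as D grows,
# so dot products of the n embedding rows estimate kernel entries.
D = 5000                                  # embedding dimension (assumption)
G = rng.standard_normal((n, D))
Z = (U * np.sqrt(h)) @ (U.T @ G) / np.sqrt(D)

K_hat = Z @ Z.T
err = np.linalg.norm(K_hat - K) / np.linalg.norm(K)
print(f"relative Frobenius error: {err:.3f}")
```

The approximation error decays like 1/√D, so the embedding dimension trades accuracy against storage; the point of random-feature methods is that Z can be formed without ever materializing the n × n kernel matrix.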
Problem

Research questions and friction points this paper is trying to address.

graph kernel
node embedding
large-scale graphs
kernel approximation
scalable representation learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

random wavelet features
graph kernel approximation
randomized spectral embeddings
node embeddings
scalable graph representation