AI Summary
Conventional diffusion models for graph generation suffer from O(n²) computational complexity in the node space, which hinders scalability. Method: This paper proposes GGSD, the first model to jointly integrate graph Laplacian spectral decomposition with denoising diffusion probabilistic modeling, establishing a spectral-space diffusion paradigm. GGSD employs spectral truncation for an efficient low-dimensional representation and introduces a permutation-invariant Transformer architecture that is linear in the number of nodes and supports node feature fusion. Crucially, it generates graph structures directly in the node space while achieving O(n) theoretical complexity. Contribution/Results: Extensive experiments demonstrate that GGSD significantly outperforms state-of-the-art methods on both synthetic and real-world graph datasets, achieving a superior trade-off among generation speed, structural fidelity, and scalability.
Abstract
In this paper, we present GGSD, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process. Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues, from which we can reconstruct the graph Laplacian and adjacency matrix. Using the Laplacian spectrum allows us to naturally capture the structural characteristics of the graph and work directly in the node space while avoiding the quadratic complexity bottleneck that limits the applicability of other diffusion-based methods. We achieve this by truncating the spectrum, which, as our experiments show, yields a faster yet accurate generative process, and by designing a novel transformer-based architecture that is linear in the number of nodes. Our permutation-invariant model can also handle node features by concatenating them to the eigenvectors of each node. An extensive set of experiments on both synthetic and real-world graphs demonstrates the strengths of our model against state-of-the-art alternatives.
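The spectral pipeline the abstract describes, i.e. eigendecomposing the graph Laplacian, keeping a truncated set of eigenpairs, and reconstructing the Laplacian and adjacency matrix from them, can be sketched in a few lines of numpy. This is a minimal illustration of the truncation/reconstruction idea only; the function names and the 0.5 binarization threshold are illustrative assumptions, not the paper's API, and the diffusion denoiser that actually samples the eigenpairs is omitted.

```python
import numpy as np

def truncated_spectrum(adj, k):
    """Eigendecompose the graph Laplacian L = D - A and keep the k
    smallest eigenpairs as a low-dimensional structural summary."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj
    eigvals, eigvecs = np.linalg.eigh(lap)  # ascending eigenvalues
    return eigvals[:k], eigvecs[:, :k]

def reconstruct_adjacency(eigvals, eigvecs, threshold=0.5):
    """Rebuild L ~= U diag(lambda) U^T from the (truncated) spectrum,
    then recover a binary adjacency matrix from the off-diagonal of -L.
    The threshold is an illustrative choice, not from the paper."""
    lap = eigvecs @ np.diag(eigvals) @ eigvecs.T
    adj = -lap
    np.fill_diagonal(adj, 0.0)
    return (adj > threshold).astype(float)

# Path graph on 5 nodes: keeping the full spectrum (k = n)
# reconstructs the adjacency matrix exactly.
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
vals, vecs = truncated_spectrum(A, k=5)
assert np.allclose(reconstruct_adjacency(vals, vecs), A)
```

With k < n the reconstruction becomes approximate, which is the trade-off the abstract refers to: a smaller spectral representation speeds up generation at a modest cost in structural fidelity.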