GLAD: Improving Latent Graph Generative Modeling with Simple Quantization

📅 2024-03-25
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
Existing graph generation models suffer from inefficiency when modeling directly in the raw data space and from difficulty preserving discrete graph structure when operating in continuous latent spaces. To address these challenges, we propose GLAD, the first equivariant discrete latent-space graph generative model. First, our approach departs from continuous latent-variable assumptions by constructing a strictly discrete, group-equivariant latent space via vector quantization. Second, we introduce the first adaptation of diffusion bridges as a learnable, structure-aware prior tailored to discrete graph latent spaces. Third, we employ group-equivariant neural networks within an end-to-end architecture, enabling global symmetry modeling without requiring graph decomposition. Evaluated on multiple standard graph generation benchmarks, GLAD achieves state-of-the-art performance while guaranteeing strict equivariance, exact discreteness, and strong generative fidelity, establishing the first latent-space framework to unify all three properties.
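The quantization step the summary describes can be illustrated with a minimal NumPy sketch (shapes and codebook size are hypothetical, not taken from the paper): each continuous node embedding is snapped to its nearest entry in a learned codebook, yielding a strictly discrete latent code.

```python
import numpy as np

# Minimal vector-quantization sketch (illustrative dimensions only).
rng = np.random.default_rng(0)
codebook = rng.normal(size=(16, 4))   # 16 code vectors of dimension 4
z = rng.normal(size=(5, 4))           # 5 continuous node embeddings

# Nearest-neighbour lookup: squared Euclidean distance to every code vector.
d = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)  # shape (5, 16)
codes = d.argmin(axis=1)   # discrete latent: one codebook index per node
z_q = codebook[codes]      # quantized embeddings passed on to the decoder

print(codes.shape, z_q.shape)  # → (5,) (5, 4)
```

Because the latent is a set of integer indices rather than free continuous vectors, a prior over it (here, the paper's diffusion bridges) never has to assume latent-space continuity.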

📝 Abstract
Learning graph generative models over latent spaces has received less attention than models that operate on the original data space and has so far demonstrated lacklustre performance. We present GLAD, a latent-space graph generative model. Unlike most previous latent-space graph generative models, GLAD operates on a discrete latent space that preserves, to a significant extent, the discrete nature of graph structures, making no unnatural assumptions such as latent-space continuity. We learn the prior of our discrete latent space by adapting diffusion bridges to its structure. By operating over an appropriately constructed latent space, we avoid relying on the decompositions often used by models that operate in the original data space. Experiments on a series of graph benchmark datasets demonstrate that GLAD, the first equivariant latent graph generative method, achieves performance competitive with state-of-the-art baselines.
Problem

Research questions and friction points this paper is trying to address.

Enhances latent graph generative modeling with quantization.
Operates on discrete latent space preserving graph structures.
Achieves competitive performance with state-of-the-art baselines.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Discrete latent space modeling
Diffusion bridges adaptation
Equivariant graph generation
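The equivariance property in the last bullet can be checked concretely. The sketch below (a generic message-passing update, not GLAD's actual architecture) verifies that the layer f(A, X) = A @ X is permutation-equivariant: relabeling the nodes before the layer gives the same result as relabeling its output.

```python
import numpy as np

# Permutation-equivariance check for a simple graph layer f(A, X) = A @ X.
rng = np.random.default_rng(1)
n = 6
A = rng.integers(0, 2, size=(n, n)).astype(float)  # adjacency matrix
X = rng.normal(size=(n, 3))                        # node features

perm = rng.permutation(n)
P = np.eye(n)[perm]            # permutation matrix for a node relabeling

lhs = (P @ A @ P.T) @ (P @ X)  # apply the layer to the permuted graph
rhs = P @ (A @ X)              # permute the layer's output instead
print(np.allclose(lhs, rhs))   # → True
```

The identity holds because P.T @ P is the identity, so P A P.T P X = P A X; group-equivariant networks are built so every layer satisfies this kind of symmetry constraint.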