🤖 AI Summary
This work addresses the challenge of generating graphs whose discrete node and edge attributes must jointly satisfy structural coupling constraints. Existing methods often rely on factorized noise or independent channels, failing to capture the dependencies between nodes and edges and producing inconsistent, low-fidelity outputs. To overcome this limitation, the authors propose the Variational Bayesian Flow Network (VBFN), which uses a sparse precision matrix induced by learned representations to define a structured joint Gaussian variational family. This formulation enables coupled updates of nodes and edges in a single fusion step, explicitly modeling their joint distribution; because the precision matrix is constructed from sample-agnostic representations, it also avoids label leakage. By circumventing the factorization assumption of conventional Bayesian flow networks, VBFN improves fidelity and diversity on synthetic and molecular graph benchmarks, outperforming existing baselines.
📝 Abstract
Graph generation aims to sample discrete node and edge attributes that jointly satisfy coupled structural constraints. Diffusion models for graphs often adopt largely factorized forward-noising, and many flow-matching methods start from factorized reference noise with coordinate-wise interpolation, so node-edge coupling is not encoded in the generative geometry and must be recovered implicitly by the core network, which can be brittle after discrete decoding. Bayesian Flow Networks (BFNs) evolve distribution parameters and naturally support discrete generation, but classical BFNs typically rely on factorized beliefs and independent channels, which limits geometric evidence fusion. We propose the Variational Bayesian Flow Network (VBFN), which lifts the belief to a tractable joint Gaussian variational family governed by structured precision matrices. Each Bayesian update then reduces to solving a symmetric positive definite linear system, enabling coupled node and edge updates within a single fusion step. We construct sample-agnostic sparse precisions from a representation-induced dependency graph, avoiding label leakage while enforcing node-edge consistency. On synthetic and molecular graph datasets, VBFN improves fidelity and diversity and surpasses existing baselines.
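To make the "Bayesian update as an SPD solve" idea concrete, here is a minimal, hypothetical sketch of one coupled Gaussian fusion step. It is not the paper's implementation: the dimensions, the single off-diagonal coupling entry (standing in for the representation-induced dependency graph), and the observation-precision values are all illustrative assumptions. It only shows the standard Gaussian conjugate update, where the posterior precision is the sum of prior and observation precisions and the posterior mean comes from a symmetric positive definite linear solve.

```python
import numpy as np

rng = np.random.default_rng(0)
d_node, d_edge = 3, 2               # toy latent sizes for node and edge blocks (illustrative)
d = d_node + d_edge

# Sparse SPD prior precision with one node-edge coupling entry, a stand-in
# for the representation-induced dependency graph in the abstract.
Lam = 2.0 * np.eye(d)
Lam[0, d_node] = Lam[d_node, 0] = 0.4   # couple a node coordinate to an edge coordinate

mu = np.zeros(d)                    # prior belief mean
y = rng.normal(size=d)              # noisy observation of the latent (sender sample)
A = 2.0 * np.eye(d)                 # assumed observation precision (accuracy)

# One fusion step: Lambda' = Lambda + A,  Lambda' mu' = Lambda mu + A y.
Lam_post = Lam + A
rhs = Lam @ mu + A @ y
c = np.linalg.cholesky(Lam_post)                         # SPD => Cholesky factor exists
mu_post = np.linalg.solve(c.T, np.linalg.solve(c, rhs))  # forward/back substitution

# Because Lam couples node and edge coordinates, evidence on the edge block
# also shifts the node block, unlike a factorized (diagonal-precision) update.
assert np.allclose(Lam_post @ mu_post, rhs)
```

The Cholesky route is the natural choice here: a symmetric positive definite posterior precision guarantees the factorization succeeds, and the two triangular solves recover the posterior mean without forming an explicit inverse.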