AI Summary
Existing deep generative models struggle to capture the non-local dependencies between the topology and edge weights of weighted graphs, and most approaches fail to balance expressive power with scalability. To address this, we propose BiGG-E, the first autoregressive generative model of the joint distribution over both graph structure and edge weights in weighted graphs. BiGG-E extends the BiGG framework with a novel sparsified weight-structure co-modeling mechanism, encoding global dependencies across topology and weights while maintaining an overall time complexity of $O((n + m)\log n)$. Extensive experiments on multiple benchmark datasets demonstrate that BiGG-E consistently outperforms state-of-the-art methods in distributional fidelity, computational efficiency, and scalability.
Abstract
Weighted graphs are ubiquitous throughout biology, chemistry, and the social sciences, motivating the development of generative models for abstract weighted graph data using deep neural networks. However, most current deep generative models are either designed for unweighted graphs, and not easily extended to weighted topologies, or incorporate edge weights without modeling their joint distribution with topology. Furthermore, learning a distribution over weighted graphs must account for complex non-local dependencies between the edges of the graph and the corresponding weight of each edge. We develop an autoregressive model, BiGG-E, a nontrivial extension of the BiGG model, that learns a joint distribution over weighted graphs while still exploiting sparsity to generate a weighted graph with $n$ nodes and $m$ edges in $O((n + m)\log n)$ time. Simulation studies and experiments on a variety of benchmark datasets demonstrate that BiGG-E best captures distributions over weighted graphs while remaining scalable and computationally efficient.
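To make the joint-modeling idea concrete, the following toy sketch samples a weighted graph autoregressively, with each edge decision and each weight conditioned on summary statistics of the partial graph generated so far. This is only a schematic illustration of coupling structure and weights in one sequential process; the probabilities, the exponential weight distribution, and the `sample_weighted_graph` helper are illustrative assumptions, not the BiGG-E architecture or its $O((n + m)\log n)$ algorithm.

```python
import random

def sample_weighted_graph(n, seed=0):
    """Toy autoregressive weighted-graph sampler (NOT BiGG-E).

    Node pairs are visited in a fixed order; for each pair we sample
    edge presence and, if present, a weight, both conditioned on the
    partially generated graph. This couples topology and weights in a
    single joint sequential process.
    """
    rng = random.Random(seed)
    edges = {}           # (i, j) -> weight, with i < j
    total_weight = 0.0
    pairs_seen = 0
    for i in range(n):
        for j in range(i + 1, n):
            pairs_seen += 1
            # Edge probability shrinks as the graph grows denser,
            # so structure decisions depend on earlier decisions.
            density = len(edges) / pairs_seen
            p_edge = 0.3 * (1.0 - density)
            if rng.random() < p_edge:
                # Weight distribution shifts with the running mean
                # weight, coupling weights to structure so far.
                mean_w = total_weight / len(edges) if edges else 1.0
                w = rng.expovariate(1.0 / mean_w)
                edges[(i, j)] = w
                total_weight += w
    return edges
```

A true deep autoregressive model would replace the hand-written conditionals above with learned neural conditionals, but the control flow mirrors the same joint factorization of structure and weights.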