🤖 AI Summary
This study addresses the problem of reliable, training-free, deterministic graph generation under a prescribed graph edit distance constraint. The authors propose a constant-depth ReLU neural network architecture grounded in graph edit distance theory, integrated with combinatorial construction techniques, which theoretically guarantees that the generated graph remains within a pre-specified edit distance $d$ from the input graph. This work establishes, for the first time, the existence of a ReLU network with constant depth and size $O(n^2 d)$ capable of exactly generating graphs satisfying the edit distance bound. Empirically, the method successfully produces valid graphs with up to 1,400 vertices and an edit distance limit of 140, substantially outperforming existing data-driven approaches.
📝 Abstract
Generation of graphs constrained by a specified graph edit distance from a source graph is important in applications such as cheminformatics, network anomaly synthesis, and structured data augmentation. Despite the growing demand for such constrained generative models in areas including molecule design and network perturbation analysis, the neural architectures required to provably generate graphs within a bounded graph edit distance remain largely unexplored. In addition, existing graph generative models are predominantly data-driven and depend heavily on the availability and quality of training data, which may result in generated graphs that do not satisfy the desired edit distance constraints. In this paper, we address these challenges by theoretically characterizing ReLU neural networks capable of generating graphs within a prescribed graph edit distance from a given graph. In particular, we show the existence of constant-depth ReLU networks of size $O(n^2 d)$ that deterministically generate graphs within edit distance $d$ from a given input graph with $n$ vertices, eliminating reliance on training data while guaranteeing validity of the generated graphs. Experimental evaluations demonstrate that the proposed network successfully generates valid graphs for instances with up to 1,400 vertices and edit distance bounds up to 140, whereas baseline generative models fail to generate graphs with the desired edit distance. These results provide a theoretical foundation for constructing compact generative models with guaranteed validity.
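To make the constraint concrete, the sketch below shows what "generating a graph within edit distance $d$" means in the simplest setting, where both graphs share a vertex set and the only edit operations are edge insertions and deletions. This is a hypothetical illustration of the constraint itself, not the paper's ReLU-network construction: `perturb_within_distance` applies at most $d$ edge flips to an adjacency matrix, and `edge_edit_distance` verifies the bound by counting differing vertex pairs.

```python
import numpy as np


def perturb_within_distance(A, d, rng):
    """Flip at most d edges of a simple undirected graph.

    A is an (n, n) symmetric 0/1 adjacency matrix. Returns a new
    adjacency matrix whose edge-flip edit distance from A is at
    most d. (Illustrative random procedure only; the paper's
    generator is a deterministic ReLU network.)
    """
    n = A.shape[0]
    B = A.copy()
    # All distinct vertex pairs (i < j) that could be flipped.
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    k = min(d, len(pairs))
    # Flipping k distinct pairs changes exactly k edges,
    # so the resulting edit distance is exactly k <= d.
    for t in rng.choice(len(pairs), size=k, replace=False):
        i, j = pairs[t]
        B[i, j] ^= 1
        B[j, i] ^= 1
    return B


def edge_edit_distance(A, B):
    """Edge insertions/deletions between two graphs on the same
    vertex set (an upper bound on the full graph edit distance,
    which also allows vertex operations and relabelings)."""
    return int(np.sum(np.triu(A != B, k=1)))
```

The data-driven baselines discussed in the abstract can only encourage such a bound statistically, whereas the constraint above is enforced by construction, which is the property the paper's network guarantees.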