Graph Neural Networks vs Convolutional Neural Networks for Graph Domination Number Prediction

📅 2025-11-22
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Computing the graph domination number—an NP-hard combinatorial invariant—is computationally prohibitive for large graphs. Method: This paper proposes an end-to-end graph neural network (GNN)-based regression framework that directly models graph structure and message-passing dynamics, bypassing conventional CNN approaches that rely on adjacency matrix representations. Contribution/Results: Evaluated on 2,000 random graphs with up to 64 vertices, the model achieves an R² score of 0.987 and a mean absolute error (MAE) of 0.372, significantly outperforming CNN baselines. Inference is over 200× faster than exact algorithms. These results demonstrate that GNNs serve as highly effective, generalizable surrogate models for NP-hard graph invariants, establishing a novel machine learning paradigm for approximating combinatorially intractable graph parameters.

Technology Category

Application Category

📝 Abstract
We investigate machine learning approaches to approximating the emph{domination number} of graphs, the minimum size of a dominating set. Exact computation of this parameter is NP-hard, restricting classical methods to small instances. We compare two neural paradigms: Convolutional Neural Networks (CNNs), which operate on adjacency matrix representations, and Graph Neural Networks (GNNs), which learn directly from graph structure through message passing. Across 2,000 random graphs with up to 64 vertices, GNNs achieve markedly higher accuracy ($R^2=0.987$, MAE $=0.372$) than CNNs ($R^2=0.955$, MAE $=0.500$). Both models offer substantial speedups over exact solvers, with GNNs delivering more than $200 imes$ acceleration while retaining near-perfect fidelity. Our results position GNNs as a practical surrogate for combinatorial graph invariants, with implications for scalable graph optimization and mathematical discovery.
Problem

Research questions and friction points this paper is trying to address.

Predicting graph domination number using neural networks
Comparing GNN and CNN performance on graph invariants
Developing efficient approximations for NP-hard graph problems
Innovation

Methods, ideas, or system contributions that make the work stand out.

GNNs directly learn from graph structure via message passing
CNNs operate on adjacency matrix representations of graphs
GNNs achieve higher accuracy than CNNs for domination number prediction
🔎 Similar Papers
No similar papers found.
Randy Davila
Randy Davila
Rice University
Graph Theory and Combinatorics
B
Beyzanur Ispir
Department of Computational Applied Mathematics & Operations Research, Rice University, Houston, USA