Neural Tractability via Structure: Learning-Augmented Algorithms for Graph Combinatorial Optimization

📅 2025-11-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Neural models for NP-hard graph combinatorial optimization suffer from suboptimal solution quality and poor out-of-distribution generalization. Method: We propose a neural-search co-design framework: (i) parameterized structural analysis identifies tractable and intractable substructures; (ii) a lightweight neural model generates only exploratory guidance signals (e.g., node priorities or pruning suggestions), which steer structure-aware exact search algorithms toward high-yield regions. This decouples learning from solving, eliminating end-to-end fitting bias while preserving neural efficiency and combinatorial algorithmic guarantees. Contribution/Results: Experiments across multiple graph optimization tasks show solutions approaching those of commercial solvers (e.g., Gurobi) in quality, with significantly stronger out-of-distribution generalization than purely neural approaches. The framework is broadly applicable, interpretable, and bridges the gap between data-driven heuristics and rigorous combinatorial optimization.

📝 Abstract
Neural models have shown promise in solving NP-hard graph combinatorial optimization (CO) problems. Once trained, they offer fast inference and reasonably high-quality solutions for in-distribution test instances, but they generally fall short in absolute solution quality compared to classical search-based algorithms, which are admittedly slower but offer an optimality guarantee once the search finishes. We propose a novel framework that combines the inference efficiency and exploratory power of neural models with the solution-quality guarantee of search-based algorithms. In particular, we use parameterized algorithms (PAs) as the search component. PAs are dedicated to identifying easy instances of generally NP-hard problems, and allow for practically efficient search by exploiting the structural simplicity of the identified easy instances. Under our framework, we use parameterized analysis to identify the structurally hard parts of a CO instance. The neural model handles the hard parts by generating advisory signals based on its data-driven understanding. The PA-based search component then integrates the advisory signals to search systematically and efficiently through the remaining structurally easy parts. Notably, our framework is agnostic to the choice of neural model and produces strictly better solutions than neural solvers alone. We examine our framework on multiple CO tasks. Empirical results show that it achieves superior solution quality, competitive with that of commercial solvers. Furthermore, by using the neural model only for exploratory advisory signals, our framework exhibits improved out-of-distribution generalization, addressing a key limitation of existing neural CO solvers.
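To make the division of labor concrete, here is a minimal sketch (not the paper's implementation) of the general pattern on Minimum Vertex Cover: a reduction rule handles the structurally easy parts exactly, while a pluggable `priority` oracle, standing in for a neural model's advisory signal, only decides the branching order. Because the signal never prunes feasible branches, the search stays exact regardless of the oracle's quality; a good ordering just reaches a strong incumbent sooner. The function names and the degree-1 reduction rule are illustrative choices, not taken from the paper.

```python
def min_vertex_cover(edge_list, priority=None):
    """Exact branch-and-reduce for Minimum Vertex Cover.

    `priority(v)` is an advisory oracle (e.g. a learned node score) that
    only chooses which branch to explore first; correctness never
    depends on it.
    """
    best = [None]  # incumbent: smallest cover found so far

    def solve(edges, cover):
        # Prune: a partial cover this large cannot beat the incumbent.
        if best[0] is not None and len(cover) >= len(best[0]):
            return
        edges, cover = set(edges), set(cover)
        # Reduction rule (the structurally easy part): the neighbor of a
        # degree-1 vertex can always be taken into the cover safely.
        while True:
            deg = {}
            for e in edges:
                for v in e:
                    deg[v] = deg.get(v, 0) + 1
            pend = next((e for e in edges if any(deg[v] == 1 for v in e)), None)
            if pend is None:
                break
            u, v = tuple(pend)
            w = v if deg[u] == 1 else u
            cover.add(w)
            edges = {f for f in edges if w not in f}
        if not edges:
            if best[0] is None or len(cover) < len(best[0]):
                best[0] = cover
            return
        # Branch on an uncovered edge: at least one endpoint is needed.
        u, v = tuple(next(iter(edges)))
        if priority is not None and priority(v) > priority(u):
            u, v = v, u  # advisory signal: try the promising endpoint first
        for w in (u, v):
            solve({f for f in edges if w not in f}, cover | {w})

    solve({frozenset(e) for e in edge_list}, set())
    return best[0]
```

In this toy version the ordering only affects how quickly the incumbent improves and hence how much the bound prunes; the framework's claim that solutions are strictly no worse than the neural solver's alone follows the same logic, since the exact search can always fall back past a bad advisory signal.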
Problem

Research questions and friction points this paper is trying to address.

Combining neural models with search algorithms for graph optimization problems
Improving solution quality while maintaining neural inference efficiency
Enhancing generalization for out-of-distribution graph combinatorial optimization instances
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combining neural models with parameterized algorithms for optimization
Using neural advisory signals for structurally hard graph parts
Integrating parameterized search on easy graph components
Jialiang Li
School of Computer Science and Mathematical Sciences, The University of Adelaide
Weitong Chen
The University of Adelaide
Data Mining · Machine Learning · Health Data Analysis
Mingyu Guo
School of Computer Science and Mathematical Sciences, The University of Adelaide