🤖 AI Summary
This paper introduces DEITSP, an efficient non-autoregressive diffusion model for the Traveling Salesman Problem (TSP). Departing from sequential generation paradigms, DEITSP pioneers a single-step controllable discrete-noise diffusion mechanism, integrated with self-consistency enhancement and alternating denoising/noising iterations. It employs a dual-modality graph Transformer to jointly encode node and edge representations, and introduces a progressive noise-scheduling framework that jointly optimizes solution quality and inference efficiency. Evaluated on real-world and large-scale TSP benchmarks, DEITSP consistently outperforms existing neural solvers, achieving state-of-the-art performance in solution quality, inference latency, and generalization to unseen problem sizes. By unifying discrete diffusion with combinatorial structure modeling, DEITSP establishes a scalable, robust diffusion-based paradigm for combinatorial optimization.
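The self-consistency idea above (denoising multiple solutions simultaneously and keeping the best) can be sketched as follows. This is an illustrative toy, not the paper's implementation: `denoise` is a placeholder standing in for the dual-modality graph Transformer's one-step prediction, and `greedy_tour_length` is an assumed simple decoder from an edge heatmap to a tour.

```python
import numpy as np

rng = np.random.default_rng(1)

def denoise(x_t):
    """Placeholder one-step denoiser: maps noisy edge heatmaps directly
    to predicted clean heatmaps (in DEITSP this is the graph Transformer).
    Here we just sharpen each row toward its maximum entry."""
    return 0.5 * x_t + 0.5 * (x_t == x_t.max(axis=-1, keepdims=True))

def greedy_tour_length(heatmap, dist):
    """Decode a tour greedily from an edge heatmap, return its length."""
    n = len(dist)
    visited, tour = {0}, [0]
    while len(tour) < n:
        scores = heatmap[tour[-1]].copy()
        scores[list(visited)] = -np.inf   # forbid revisiting nodes
        tour.append(int(np.argmax(scores)))
        visited.add(tour[-1])
    return sum(dist[tour[i], tour[(i + 1) % n]] for i in range(n))

# Toy instance: 10 random cities, K independent pure-noise heatmaps.
n, K = 10, 8
coords = rng.random((n, 2))
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
noisy = rng.random((K, n, n))

# Self-consistency: denoise all K candidates in one batch,
# decode each, and keep the best tour found.
candidates = denoise(noisy)
lengths = [greedy_tour_length(h, dist) for h in candidates]
print(f"best of {K} parallel candidates: {min(lengths):.3f}")
```

Because the K denoising passes are independent, they batch naturally on a GPU, which is how parallel candidates can improve quality without a proportional increase in latency.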
📄 Abstract
Recent advances in neural models have shown considerable promise in solving Traveling Salesman Problems (TSPs) with little hand-crafted engineering. However, while non-autoregressive (NAR) approaches benefit from faster inference through parallelism, they typically deliver solutions of inferior quality compared to autoregressive ones. To enhance solution quality while maintaining fast inference, we propose DEITSP, a diffusion model with efficient iterations tailored for TSP that operates in a NAR manner. Firstly, we introduce a one-step diffusion model that integrates a controlled discrete noise-addition process with self-consistency enhancement, enabling optimal solution prediction through simultaneous denoising of multiple solutions. Secondly, we design a dual-modality graph Transformer to bolster the extraction and fusion of features from the node and edge modalities, while further accelerating inference with fewer layers. Thirdly, we develop an efficient iterative strategy that alternates between adding and removing noise to improve exploration compared to previous diffusion methods. Additionally, we devise a scheduling framework that progressively refines the solution space by adjusting noise levels, facilitating a smooth search for optimal solutions. Extensive experiments on real-world and large-scale TSP instances demonstrate that DEITSP performs favorably against existing neural approaches in terms of solution quality, inference latency, and generalization ability. Our code is available at https://github.com/DEITSP/DEITSP.
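The alternating denoise/noise iteration under a progressive noise schedule can be sketched as a minimal toy loop. This is a hedged illustration under assumptions: `denoise` stands in for the model's one-step x0 prediction, `add_noise` is an assumed Bernoulli-style discrete corruption whose strength decreases along the schedule, and the decoding is a simple greedy tour extraction, none of which is the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def denoise(x_t):
    """Placeholder one-step denoiser predicting a clean edge heatmap x0
    from the noisy x_t (sharpen each row toward its maximum)."""
    sharpened = (x_t == x_t.max(axis=-1, keepdims=True)).astype(float)
    return 0.5 * x_t + 0.5 * sharpened

def add_noise(x0, t, n_steps):
    """Controlled discrete noise: flip each edge indicator with a
    probability that shrinks as the schedule position t decreases."""
    flip_prob = 0.5 * t / n_steps
    flips = rng.random(x0.shape) < flip_prob
    return np.where(flips, 1.0 - x0, x0)

def tour_cost(heatmap, dist):
    """Greedy decoding of a tour from the heatmap, then its length."""
    n = len(dist)
    visited, tour = {0}, [0]
    while len(tour) < n:
        scores = heatmap[tour[-1]].copy()
        scores[list(visited)] = -np.inf   # forbid revisiting nodes
        tour.append(int(np.argmax(scores)))
        visited.add(tour[-1])
    return sum(dist[tour[i], tour[(i + 1) % n]] for i in range(n))

# Toy instance: 8 random cities.
n, n_steps = 8, 6
coords = rng.random((n, 2))
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)

# Start from pure noise, then alternate: denoise in one step, decode,
# and re-noise at a progressively lower level so the search contracts.
x = rng.random((n, n))
best = np.inf
for t in range(n_steps, 0, -1):
    x0_pred = denoise(x)                     # one-step x0 prediction
    best = min(best, tour_cost(x0_pred, dist))
    x = add_noise(x0_pred, t - 1, n_steps)   # re-noise, lower level

print(f"best greedy tour length: {best:.3f}")
```

The key contrast with standard diffusion sampling is that each iteration produces a decodable solution (so any step can terminate early with a valid tour), and the re-noising step reintroduces controlled randomness for exploration rather than following a fixed reverse-time trajectory.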