LocalEscaper: A Weakly-supervised Framework with Regional Reconstruction for Scalable Neural TSP Solvers

📅 2025-02-18
🤖 AI Summary
For large-scale Traveling Salesman Problems (TSP), existing supervised learning approaches rely heavily on abundant high-quality labeled data, while reinforcement learning methods suffer from poor sample efficiency. This paper proposes LocalEscaper, a weakly supervised learning framework that enables training with low-quality labels and synergistically integrates strengths of both supervised and reinforcement learning. Its key contributions are: (1) a novel regional reconstruction strategy that effectively escapes local optima; and (2) a linear-time attention mechanism that significantly enhances scalability. Experiments demonstrate that LocalEscaper achieves state-of-the-art performance on both synthetic and real-world TSP benchmarks. Notably, it is the first method to efficiently solve TSP instances with up to 50,000 cities, striking an unprecedented balance between solution quality and computational efficiency.

📝 Abstract
Neural solvers have shown significant potential in solving the Traveling Salesman Problem (TSP), yet current approaches face significant challenges. Supervised learning (SL)-based solvers require large amounts of high-quality labeled data, while reinforcement learning (RL)-based solvers, though less dependent on such data, often suffer from inefficiencies. To address these limitations, we propose LocalEscaper, a novel weakly-supervised learning framework for large-scale TSP. LocalEscaper effectively combines the advantages of both SL and RL, enabling effective training on datasets with low-quality labels. To further enhance solution quality, we introduce a regional reconstruction strategy, which mitigates the problem of local optima, a common issue in existing local reconstruction methods. Additionally, we propose a linear-complexity attention mechanism that reduces computational overhead, enabling the efficient solution of large-scale TSPs without sacrificing performance. Experimental results on both synthetic and real-world datasets demonstrate that LocalEscaper outperforms existing neural solvers, achieving state-of-the-art results. Notably, it sets a new benchmark for scalability and efficiency, solving TSP instances with up to 50,000 cities.
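The abstract's claim of linear-complexity attention can be illustrated with the standard kernelized trick: replacing softmax attention with a positive feature map lets the key-value product be summed once, so cost drops from O(n²·d) to O(n·d²). This is a generic sketch (in the style of kernelized linear attention), not the paper's specific mechanism; the feature map `phi` and the function name are assumptions for illustration.

```python
import numpy as np

def linear_attention(Q, K, V):
    """Kernelized attention with linear cost in sequence length n.

    Q, K: (n, d) queries and keys; V: (n, d_v) values.
    phi(x) = elu(x) + 1 keeps features positive so the normalizer is valid.
    """
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))
    Qp, Kp = phi(Q), phi(K)                        # (n, d) each
    KV = Kp.T @ V                                  # (d, d_v): keys/values summed once
    Z = Qp @ Kp.sum(axis=0, keepdims=True).T       # (n, 1) per-query normalizer
    return (Qp @ KV) / Z                           # O(n·d·d_v) instead of O(n²·d)
```

Because `Qp @ (Kp.T @ V)` equals `(Qp @ Kp.T) @ V`, the result matches the quadratic formulation exactly while never materializing the n×n attention matrix, which is what makes 50,000-city instances tractable for attention-based solvers.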
Problem

Research questions and friction points this paper is trying to address.

Weakly-supervised learning for TSP
Regional reconstruction for local optima
Linear-complexity attention mechanism
Innovation

Methods, ideas, or system contributions that make the work stand out.

Weakly-supervised learning framework
Regional reconstruction strategy
Linear-complexity attention mechanism
Junrui Wen
School of Computer Science, Huazhong University of Science and Technology, China
Yifei Li
School of Computer Science, Huazhong University of Science and Technology, China
Bart Selman
Professor of Computer Science, Cornell University
Kun He
School of Computer Science, Huazhong University of Science and Technology, China