🤖 AI Summary
This work addresses the limitations of existing predict-and-search (PaS) methods for mixed-integer linear programming (MILP), which typically assume variable independence and produce only deterministic point predictions, leading to insufficient solution diversity and high search costs. The authors propose the SRG framework, which integrates Lagrangian relaxation with score-based generative models. By employing convolutional kernels to capture variable dependencies and leveraging Lagrangian relaxation–guided stochastic differential equations, SRG generates diverse, high-quality initial solutions. Additionally, it constructs a compact trust-region subproblem to accelerate convergence. Evaluated on multiple public benchmarks, SRG outperforms current learning-based approaches and achieves solution quality comparable to state-of-the-art exact solvers under zero-shot cross-scale and cross-problem transfer settings, while substantially reducing computational overhead.
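To make the guidance idea concrete: in score-based sampling, each reverse-SDE step moves samples along a learned score; adding the gradient of a Lagrangian relaxation penalty to that drift steers samples toward feasible regions. The sketch below shows one Euler–Maruyama step with such a guidance term. The function names (`score_fn`, `lagrangian_grad_fn`) and the weighting scheme are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def guided_euler_step(x, score_fn, lagrangian_grad_fn, dt, lam=1.0, rng=None):
    """One Euler-Maruyama step of a reverse-time SDE whose drift combines
    the learned score with a Lagrangian-relaxation penalty gradient that
    pushes samples toward constraint-satisfying regions.

    Hypothetical sketch of the guidance mechanism; the paper's actual
    drift, noise schedule, and multiplier updates may differ.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    drift = score_fn(x) - lam * lagrangian_grad_fn(x)  # score + feasibility pull
    noise = np.sqrt(dt) * rng.standard_normal(x.shape)  # Brownian increment
    return x + drift * dt + noise
```

Running the step repeatedly with a Lagrangian gradient derived from the MILP's relaxed constraints would bias the sampled continuous variables toward feasible assignments before rounding.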
📝 Abstract
Predict-and-search (PaS) methods have shown promise for accelerating mixed-integer linear programming (MILP) solving. However, existing approaches typically assume variable independence and rely on deterministic single-point predictions, which limits solution diversity and often necessitates extensive downstream search for high-quality solutions. In this paper, we propose **SRG**, a generative framework based on Lagrangian relaxation–guided stochastic differential equations (SDEs), with theoretical guarantees on solution quality. SRG leverages convolutional kernels to capture inter-variable dependencies while integrating Lagrangian relaxation to guide the sampling process toward feasible and near-optimal regions. Rather than producing a single estimate, SRG generates diverse, high-quality solution candidates that collectively define compact and effective trust-region subproblems for standard MILP solvers. Across multiple public benchmarks, SRG consistently outperforms existing machine-learning baselines in solution quality. Moreover, SRG demonstrates strong zero-shot transferability: on unseen cross-scale and cross-problem instances, it achieves solution quality competitive with state-of-the-art exact solvers while significantly reducing computational overhead through faster search.
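The trust-region construction can be illustrated as follows: multiple sampled candidates yield per-variable marginals, and variables on which the samples strongly agree are fixed, leaving only the uncertain ones for the downstream solver. This is a minimal sketch under assumed thresholds; the paper's actual subproblem construction may differ.

```python
import numpy as np

def trust_region_bounds(candidates, fix_threshold=0.9):
    """From k sampled binary candidates (shape k x n), compute bounds for
    a reduced MILP: variables whose sampled marginal is near 1 (or 0) are
    fixed, the rest keep their full 0/1 range.

    Hypothetical sketch; `fix_threshold` and the fixing rule are
    illustrative assumptions, not the authors' exact construction.
    """
    marginals = np.asarray(candidates, dtype=float).mean(axis=0)
    lower = (marginals >= fix_threshold).astype(int)          # fix to 1
    upper = np.where(marginals <= 1.0 - fix_threshold, 0, 1)  # fix to 0
    return lower, upper  # pass as variable bounds to the MILP solver
```

Variables with `lower == upper` are effectively fixed, so the solver explores only the disagreement set, which is what makes the subproblem compact.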