Dynamic Stratified Contrastive Learning with Upstream Augmentation for MILP Branching

📅 2025-11-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenges of sparse upstream node samples, rapidly shifting semantic distributions across tree depths, and the high computational cost of strong branching labels in MILP branch-and-bound, this paper proposes a dynamic stratified contrastive learning framework. The method performs adaptive hierarchical clustering based on node feature distributions, employs graph convolutional networks (GCNs) to model variable-constraint structures, and introduces upstream instance augmentation alongside theoretically grounded equivalent perturbations to mitigate data bias and generalization bottlenecks. A dynamic hierarchical contrastive loss enables fine-grained node discrimination and cross-depth semantic alignment. Evaluated on standard MILP benchmarks, the model achieves significant improvements: +8.2% branching accuracy and a 19.6% reduction in average solving time, while generalizing well to unseen problem instances.

📝 Abstract
Mixed Integer Linear Programming (MILP) is a fundamental class of NP-hard problems that has garnered significant attention from both academia and industry. The Branch-and-Bound (B&B) method is the dominant approach for solving MILPs, and branching plays an important role in B&B methods. Neural learning frameworks have recently been developed to enhance branching policies and the efficiency of solving MILPs. However, these methods still struggle with semantic variation across depths, the scarcity of upstream nodes, and the costly collection of strong branching samples. To address these issues, we propose a Dynamic Stratified Contrastive Training Framework for MILP Branching. It groups branch-and-bound nodes based on their feature distributions and trains a GCNN-based discriminative model to progressively separate nodes across groups, learning finer-grained node representations throughout the tree. To address data scarcity and imbalance at upstream nodes, we introduce an upstream-augmented MILP derivation procedure that generates both theoretically equivalent and perturbed instances. Our framework effectively models subtle semantic differences between nodes, significantly enhancing branching accuracy and solving efficiency, particularly for upstream nodes. Extensive experiments on standard MILP benchmarks demonstrate that our method enhances branching accuracy, reduces solving time, and generalizes effectively to unseen instances.
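The abstract's "theoretically equivalent" augmented instances can be illustrated with a standard identity: scaling each constraint row of A x ≤ b by an independent positive factor leaves the feasible region (and hence the MILP optimum) unchanged, while perturbing the raw coefficients a learned model observes. The sketch below is an assumption about one such transformation, not the paper's exact derivation procedure.

```python
import numpy as np

def equivalent_scaling(A, b, rng):
    """Scale each row of the system A x <= b by a positive factor.

    Since c * (a_i . x) <= c * b_i  <=>  a_i . x <= b_i for any c > 0,
    the feasible region is unchanged, but the coefficients the
    branching model sees are different -- a label-free augmentation.
    """
    scales = rng.uniform(0.5, 2.0, size=A.shape[0])
    return A * scales[:, None], b * scales

rng = np.random.default_rng(1)
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])
b = np.array([4.0, 6.0])
A2, b2 = equivalent_scaling(A, b, rng)

x = np.array([1.0, 1.0])
# the same point gets the same feasibility verdict under both formulations
print(np.all(A @ x <= b), np.all(A2 @ x <= b2))  # True True
```

Perturbed (non-equivalent) instances, by contrast, would change b or A slightly to broaden the training distribution at the cost of exact equivalence.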
Problem

Research questions and friction points this paper is trying to address.

Addresses semantic variation across branch-and-bound tree depths
Mitigates data scarcity and imbalance at upstream nodes
Reduces dependency on costly strong branching samples
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic stratified contrastive learning for MILP branching
Upstream-augmented MILP derivation for data scarcity
GCNN-based discriminative model for node representation
Tongkai Lu
SKLSDE Lab, Beihang University, Beijing, China
Shuai Ma
SKLSDE Lab, Beihang University, Beijing, China
Chongyang Tao
Associate Professor of Computer Science, Beihang University
Natural Language Processing · Dialogue Systems · Information Retrieval · Data Intelligence