🤖 AI Summary
This work addresses robust classification of imbalanced tabular data in expert systems, with particular emphasis on high-precision detection of rare yet critical samples. To this end, we propose a novel framework comprising: (1) quantum entanglement-inspired feature interaction modeling, leveraging sinusoidal transformation and gated attention to enhance representation of low-frequency critical patterns; (2) kNN-guided, semantics-aware dynamic mixup to mitigate poor generalization in low-density regions; and (3) a multi-objective hybrid optimization mechanism integrating focal-weighted loss, supervised contrastive learning, triplet margin loss, and variance regularization. Evaluated on 18 real-world imbalanced tabular datasets, our method consistently outperforms 20 state-of-the-art approaches, achieving average improvements of 4.2% in macro-F1 and 6.8% in recall, establishing a new benchmark for tabular imbalance learning.
📝 Abstract
Expert systems often operate in domains characterized by class-imbalanced tabular data, where detecting rare but critical instances is essential for safety and reliability. While conventional approaches, such as cost-sensitive learning, oversampling, and graph neural networks, provide partial solutions, they suffer from drawbacks like overfitting, label noise, and poor generalization in low-density regions. To address these challenges, we propose QCL-MixNet, a novel Quantum-Informed Contrastive Learning framework augmented with k-nearest neighbor (kNN)-guided dynamic mixup for robust classification under imbalance. QCL-MixNet integrates three core innovations: (i) a quantum entanglement-inspired layer that models complex feature interactions through sinusoidal transformations and gated attention, (ii) a sample-aware mixup strategy that adaptively interpolates feature representations of semantically similar instances to enhance minority-class representation, and (iii) a hybrid loss function that unifies focal reweighting, supervised contrastive learning, triplet margin loss, and variance regularization to improve both intra-class compactness and inter-class separability. Extensive experiments on 18 real-world imbalanced datasets (binary and multi-class) demonstrate that QCL-MixNet consistently outperforms 20 state-of-the-art machine learning, deep learning, and GNN-based baselines in macro-F1 and recall, often by substantial margins. Ablation studies further validate the critical role of each architectural component, and theoretical analyses reinforce the framework's expressiveness, generalization, and optimization robustness. Our results establish QCL-MixNet as a new benchmark for tabular imbalance handling in expert systems.
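To make the first component concrete, the sketch below shows one plausible reading of a "sinusoidal transformation + gated attention" feature layer: a linear projection passed through sin(.) to model oscillatory feature interactions, modulated by a sigmoid gate. The abstract does not specify the layer's exact form, so the function name, weight shapes, and composition here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def quantum_inspired_layer(x, w_sin, w_gate):
    """Illustrative sketch (assumed form) of a sinusoidal + gated layer.

    x: (batch, in_dim) input features
    w_sin, w_gate: (in_dim, hidden) projection weights (hypothetical names)
    """
    # Sinusoidal transformation: project features linearly, then pass the
    # result through sin(.) to encode bounded, oscillatory interactions.
    entangled = np.sin(x @ w_sin)
    # Gated attention: a sigmoid gate controls how much of the transformed
    # signal passes through, element-wise.
    gate = 1.0 / (1.0 + np.exp(-(x @ w_gate)))
    return gate * entangled

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_sin = rng.normal(size=(8, 16))
w_gate = rng.normal(size=(8, 16))
h = quantum_inspired_layer(x, w_sin, w_gate)
print(h.shape)  # (4, 16)
```

Note that the output is bounded in (-1, 1) by construction (sine times a sigmoid gate), which is one way such a layer could keep rare-pattern activations on a comparable scale to frequent ones.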
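The second component, kNN-guided sample-aware mixup, can be sketched as follows: each sample is interpolated only with one of its k nearest neighbors from the same class, so synthetic points stay inside semantically coherent, low-density regions rather than bridging unrelated classes. The hyperparameters (k, the Beta interpolation parameter) and the restriction to same-class neighbors are assumptions made for this sketch; the paper's actual strategy may differ.

```python
import numpy as np

def knn_guided_mixup(X, y, k=3, alpha=0.4, seed=None):
    """Sketch of kNN-guided mixup (assumed variant, not the paper's exact method).

    For each sample, pick a random neighbor among its k nearest same-class
    points and interpolate with a Beta(alpha, alpha)-distributed weight.
    """
    rng = np.random.default_rng(seed)
    X_new, y_new = [], []
    for c in np.unique(y):
        Xc = X[y == c]
        # Pairwise Euclidean distances within the class.
        d = np.linalg.norm(Xc[:, None] - Xc[None, :], axis=-1)
        np.fill_diagonal(d, np.inf)  # exclude self-matches
        for i in range(len(Xc)):
            k_eff = min(k, len(Xc) - 1)
            if k_eff < 1:  # singleton class: nothing to mix with
                continue
            nbrs = np.argsort(d[i])[:k_eff]
            j = rng.choice(nbrs)
            lam = rng.beta(alpha, alpha)
            X_new.append(lam * Xc[i] + (1 - lam) * Xc[j])
            y_new.append(c)
    return np.array(X_new), np.array(y_new)

X = np.array([[0., 0.], [1., 0.], [0., 1.], [5., 5.], [6., 5.]])
y = np.array([0, 0, 0, 1, 1])
Xm, ym = knn_guided_mixup(X, y, k=2, seed=0)
```

Because every synthetic point is a convex combination of two same-class neighbors, the augmented minority samples never leave the local neighborhood of real minority data, which is the intuition behind improved generalization in low-density regions.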