Quantum-Informed Contrastive Learning with Dynamic Mixup Augmentation for Class-Imbalanced Expert Systems

📅 2025-06-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses robust classification of imbalanced tabular data in expert systems, with particular emphasis on high-precision detection of rare yet critical samples. To this end, we propose a novel framework comprising: (1) quantum entanglement-inspired feature interaction modeling, leveraging sinusoidal transformation and gated attention to enhance representation of low-frequency critical patterns; (2) kNN-guided semantic-aware dynamic MixUp to mitigate poor generalization in low-density regions; and (3) a multi-objective hybrid optimization mechanism integrating focal-weighted loss, supervised contrastive learning, triplet margin loss, and variance regularization. Evaluated on 18 real-world imbalanced tabular datasets, our method consistently outperforms 20 state-of-the-art approaches, achieving average improvements of 4.2% in macro-F1 and 6.8% in recall—establishing a new benchmark for tabular imbalance learning.
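The focal-weighted term of the hybrid loss mentioned above can be sketched in a few lines; note the `gamma` and `alpha` values below are the standard defaults from the focal-loss literature, not values reported by this paper, and the binary form shown here is only illustrative.

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss sketch: down-weights easy, well-classified
    examples so errors on rare-class samples dominate the objective.
    gamma/alpha are standard defaults, not the paper's settings."""
    p_t = np.where(y == 1, p, 1 - p)          # probability of the true class
    a_t = np.where(y == 1, alpha, 1 - alpha)  # class-balancing weight
    return -np.mean(a_t * (1 - p_t) ** gamma * np.log(p_t + 1e-12))

easy = focal_loss(np.array([0.95]), np.array([1]))  # confident, correct
hard = focal_loss(np.array([0.05]), np.array([1]))  # confident, wrong
print(easy, hard)
```

The `(1 - p_t) ** gamma` factor is what distinguishes this from plain weighted cross-entropy: a confidently correct prediction contributes almost nothing, which is the point of focal reweighting under imbalance.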

📝 Abstract
Expert systems often operate in domains characterized by class-imbalanced tabular data, where detecting rare but critical instances is essential for safety and reliability. While conventional approaches, such as cost-sensitive learning, oversampling, and graph neural networks, provide partial solutions, they suffer from drawbacks like overfitting, label noise, and poor generalization in low-density regions. To address these challenges, we propose QCL-MixNet, a novel Quantum-Informed Contrastive Learning framework augmented with k-nearest neighbor (kNN) guided dynamic mixup for robust classification under imbalance. QCL-MixNet integrates three core innovations: (i) a Quantum Entanglement-inspired layer that models complex feature interactions through sinusoidal transformations and gated attention, (ii) a sample-aware mixup strategy that adaptively interpolates feature representations of semantically similar instances to enhance minority class representation, and (iii) a hybrid loss function that unifies focal reweighting, supervised contrastive learning, triplet margin loss, and variance regularization to improve both intra-class compactness and inter-class separability. Extensive experiments on 18 real-world imbalanced datasets (binary and multi-class) demonstrate that QCL-MixNet consistently outperforms 20 state-of-the-art machine learning, deep learning, and GNN-based baselines in macro-F1 and recall, often by substantial margins. Ablation studies further validate the critical role of each architectural component. Our results establish QCL-MixNet as a new benchmark for tabular imbalance handling in expert systems. Theoretical analyses reinforce its expressiveness, generalization, and optimization robustness.
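The first component described in the abstract combines a sinusoidal transformation with gated attention. A minimal numpy sketch of that idea follows; the linear-projection form, weight shapes, and wiring here are illustrative assumptions, not the actual QCL-MixNet layer.

```python
import numpy as np

def quantum_inspired_layer(x, W_sin, W_gate):
    """Illustrative sketch: sinusoidal feature transform modulated by a
    sigmoid gate, loosely following the abstract's description of the
    quantum entanglement-inspired layer. Not the paper's exact design."""
    # Sinusoidal transform: project features into an oscillatory basis,
    # analogous to phase-like interactions between features.
    phase = np.sin(x @ W_sin)
    # Gated attention: a sigmoid gate controls how much of each
    # transformed feature passes through.
    gate = 1.0 / (1.0 + np.exp(-(x @ W_gate)))
    return gate * phase

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))        # batch of 4 samples, 8 features
W_sin = rng.normal(size=(8, 8))
W_gate = rng.normal(size=(8, 8))
out = quantum_inspired_layer(x, W_sin, W_gate)
print(out.shape)  # (4, 8)
```

Because the gate lies in (0, 1) and the sine in [-1, 1], the layer's outputs are bounded, which keeps the transformed representation numerically well-behaved regardless of input scale.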
Problem

Research questions and friction points this paper is trying to address.

Address class-imbalanced data in expert systems
Improve detection of rare critical instances
Enhance robustness and generalization in classification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Quantum Entanglement-inspired layer models feature interactions
Dynamic mixup strategy enhances minority class representation
Hybrid loss function improves class compactness and separability
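The dynamic mixup contribution above can be sketched as follows. The restriction to same-class neighbours, the choice of `k`, and the Beta(0.4, 0.4) interpolation weight are illustrative assumptions standing in for the paper's "semantically similar instances" criterion, not its exact procedure.

```python
import numpy as np

def knn_guided_mixup(X, y, k=3, alpha=0.4, rng=None):
    """Sketch of kNN-guided mixup: interpolate each sample with one of
    its k nearest same-class neighbours, so synthetic points stay in
    semantically coherent regions instead of bridging unrelated classes.
    Assumptions: same-class neighbours only, label kept unchanged."""
    rng = rng if rng is not None else np.random.default_rng()
    X_new, y_new = [], []
    for i in range(len(X)):
        same = np.where(y == y[i])[0]
        same = same[same != i]
        if len(same) == 0:
            continue  # singleton class: nothing to mix with
        # Pick a partner among the k nearest same-class points.
        d = np.linalg.norm(X[same] - X[i], axis=1)
        nn = same[np.argsort(d)[:k]]
        j = rng.choice(nn)
        lam = rng.beta(alpha, alpha)
        X_new.append(lam * X[i] + (1 - lam) * X[j])
        y_new.append(y[i])
    return np.array(X_new), np.array(y_new)

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0], [6.0, 5.0]])
y = np.array([0, 0, 0, 1, 1])
X_mix, y_mix = knn_guided_mixup(X, y, k=2, rng=np.random.default_rng(1))
print(X_mix.shape, y_mix)
```

Restricting interpolation partners via kNN is what distinguishes this from vanilla mixup, which pairs samples uniformly at random and can place synthetic minority points in regions dominated by the majority class.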
Md Abrar Jahin
Center on Knowledge Graphs, Information Sciences Institute, University of Southern California
Deep Learning · Quantum Machine Learning · Geometric Deep Learning · Trustworthy AI
Adiba Abid
Department of Industrial Engineering and Management, Khulna University of Engineering & Technology (KUET), Khulna, 9203, Bangladesh
M. F. Mridha
Department of Computer Science, American International University-Bangladesh (AIUB), Dhaka, 1229, Bangladesh