🤖 AI Summary
To address the challenges of complex cross-layer KPI interactions and scarce expert annotations in root-cause diagnosis of QoE degradation in mobile networks, this paper proposes an end-to-end diagnostic framework that jointly leverages weakly supervised data and domain expertise. Methodologically, it introduces a class-conditional diffusion model for time-series data augmentation, integrated with denoising contrastive pretraining and expert-label-guided fine-tuning to achieve noise-robust representation learning and semantic alignment under few-shot conditions. The framework unifies data-driven and knowledge-guided diagnostic pathways within a single architecture. Evaluated on real-world operator data, it achieves state-of-the-art performance, significantly outperforming conventional machine learning and semi-supervised time-series classification methods in root-cause identification accuracy. Results demonstrate both effectiveness in complex cross-layer attribution tasks and strong generalization capability.
📝 Abstract
Diagnosing the root causes of Quality of Experience (QoE) degradations in operational mobile networks is challenging due to complex cross-layer interactions among key performance indicators (KPIs) and the scarcity of reliable expert annotations. Although rule-based heuristics can generate labels at scale, they are noisy and coarse-grained, limiting the accuracy of purely data-driven approaches. To address this, we propose DK-Root, a joint data-and-knowledge-driven framework that unifies scalable weak supervision with precise expert guidance for robust root-cause analysis. DK-Root first pretrains an encoder via contrastive representation learning on abundant rule-based labels, explicitly suppressing their label noise through a supervised contrastive objective. To supply task-faithful data augmentation, we introduce a class-conditional diffusion model that generates KPI sequences preserving root-cause semantics; by controlling the number of reverse diffusion steps, it produces weak and strong augmentations that improve intra-class compactness and inter-class separability. Finally, the encoder and a lightweight classifier are jointly fine-tuned on scarce expert-verified labels to sharpen decision boundaries. Extensive experiments on a real-world, operator-grade dataset demonstrate state-of-the-art accuracy, with DK-Root surpassing traditional ML and recent semi-supervised time-series methods. Ablations confirm the necessity of the conditional diffusion augmentation and the pretrain-finetune design, validating both representation quality and classification gains.
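The paper does not publish code, but the supervised contrastive objective it describes (pulling together embeddings of sequences that share a rule-based root-cause label, including diffusion-generated views) follows the standard SupCon formulation. Below is a minimal NumPy sketch of such a loss; the function name `supcon_loss` and all shapes are illustrative assumptions, not DK-Root's actual implementation.

```python
import numpy as np

def supcon_loss(z, labels, temperature=0.1):
    """Supervised contrastive loss over a batch of embeddings (illustrative).

    z:      (N, d) embeddings, e.g., encoder outputs for weak/strong
            diffusion-augmented views of KPI sequences.
    labels: (N,) class labels (possibly noisy, rule-based).
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize embeddings
    sim = z @ z.T / temperature                        # pairwise cosine similarities
    n = len(labels)
    logits = sim - sim.max(axis=1, keepdims=True)      # stabilize the softmax
    exp = np.exp(logits) * (1.0 - np.eye(n))           # exclude self-similarity
    log_prob = logits - np.log(exp.sum(axis=1, keepdims=True))
    # Positives: same label as the anchor, excluding the anchor itself.
    pos = (labels[:, None] == labels[None, :]).astype(float) * (1.0 - np.eye(n))
    # Average log-probability over each anchor's positives (guard empty sets).
    per_anchor = (pos * log_prob).sum(axis=1) / np.maximum(pos.sum(axis=1), 1.0)
    return -per_anchor.mean()
```

Under this objective, batches whose same-label embeddings cluster tightly incur a lower loss than batches where labels are scattered across clusters, which is the intra-class compactness / inter-class separability effect the abstract attributes to the weak and strong diffusion augmentations.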