Joint graph entropy knowledge distillation for point cloud classification and robustness against corruptions

📅 2025-09-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the neglect of inter-class semantic correlations and the disruption of non-IID data structures in 3D point cloud classification, this paper proposes a Joint Graph Entropy Knowledge Distillation (JGEKD) framework. Methodologically, it introduces joint graph entropy for the first time to construct a discriminative loss function that explicitly models inter-class semantic dependencies; further, it designs a dual-branch Siamese architecture integrating self-distillation and teacher-student distillation to enable robust knowledge transfer between original point clouds and their perturbed variants. The key contribution lies in unifying the modeling of inter-class dependencies and sample-level perturbation invariance. Extensive experiments demonstrate significant improvements in generalization and robustness across benchmarks—including ScanObjectNN, ModelNet40, ScanNetV2_cls, and ModelNet-C—particularly under diverse point cloud corruptions such as dropout, noise, and occlusion.

📝 Abstract
Classification tasks in 3D point clouds often assume that class events are independent and identically distributed (IID), although this assumption destroys the correlation between classes. This study proposes a classification strategy, Joint Graph Entropy Knowledge Distillation (JGEKD), suitable for non-independent and identically distributed 3D point cloud data, which achieves knowledge transfer of class correlations through knowledge distillation by constructing a loss function based on joint graph entropy. First, we employ joint graphs to capture the hidden relationships between classes and implement knowledge distillation to train our model by calculating the entropy of the graph. Subsequently, to handle 3D point clouds invariant to spatial transformations, we construct Siamese structures and develop two frameworks, self-knowledge distillation and teacher-knowledge distillation, to facilitate information transfer between different transformation forms of the same data. In addition, we use the above framework to achieve knowledge transfer between point clouds and their corrupted forms, increasing the model's robustness against corruption. Extensive experiments on ScanObjectNN, ModelNet40, ScanNetV2_cls and ModelNet-C demonstrate that the proposed strategy achieves competitive results.
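The exact definition of the joint graph entropy loss is given only in the full paper; as a rough, hedged illustration of an entropy-based coupling of two class distributions (all function names here are hypothetical, and the pairwise joint distribution below is an assumption, not the paper's formula), one could compute the Shannon entropy of a joint distribution formed from two softened prediction vectors:

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to class probabilities, optionally softened by a temperature."""
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def joint_entropy(p, q):
    """Shannon entropy of the outer-product joint distribution p(i) * q(j).

    Treats the two class distributions as independent marginals. The paper's
    joint *graph* entropy additionally weights class pairs by an inter-class
    graph; this sketch uses a uniform (complete) graph instead.
    """
    h = 0.0
    for pi in p:
        for qj in q:
            pij = pi * qj
            if pij > 0.0:
                h -= pij * math.log(pij)
    return h
```

For independent marginals this joint entropy reduces exactly to the sum of the two marginal entropies; the graph-based weighting of class pairs is what would couple the teacher's and student's distributions, and that coupling is deliberately omitted from this minimal sketch.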
Problem

Research questions and friction points this paper is trying to address.

Addresses non-IID class correlations in 3D point cloud classification
Proposes joint graph entropy distillation for knowledge transfer
Enhances model robustness against point cloud corruptions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Joint graph entropy captures class correlations
Siamese structures handle spatial transformation invariance
Knowledge distillation transfers between clean and corrupted data
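The Siamese clean-vs-corrupted consistency idea above can be sketched as a loss between a model's predictions on a point cloud and on a corrupted twin of it. This is a minimal, assumption-laden sketch: the dropout corruption, the KL-divergence choice, and all helper names are illustrative, not taken from the paper.

```python
import math
import random

def random_dropout(points, drop_ratio=0.2, rng=None):
    """Corrupt a point cloud by randomly dropping a fraction of its points."""
    rng = rng or random.Random(0)
    keep = max(1, int(len(points) * (1.0 - drop_ratio)))
    return rng.sample(points, keep)

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two class-probability vectors."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def siamese_distill_loss(model, cloud, corrupt=random_dropout):
    """Consistency loss between predictions on a cloud and its corrupted twin.

    `model` maps a point list to a class-probability vector; in the paper the
    two branches either share weights (self-distillation) or pair a student
    with a teacher (teacher-knowledge distillation).
    """
    p_clean = model(cloud)
    p_corrupt = model(corrupt(cloud))
    return kl_divergence(p_clean, p_corrupt)
```

Minimizing this loss pushes the model toward corruption-invariant predictions, which is the intuition behind the robustness gains reported on ModelNet-C.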