🤖 AI Summary
To address the neglect of inter-class semantic correlations and the disruption of non-IID data structures in 3D point cloud classification, this paper proposes a Joint Graph Entropy Knowledge Distillation (JGEKD) framework. Methodologically, it introduces joint graph entropy for the first time to construct a discriminative loss function that explicitly models inter-class semantic dependencies; further, it designs a dual-branch Siamese architecture integrating self-distillation and teacher-student distillation to enable robust knowledge transfer between original point clouds and their perturbed variants. The key contribution lies in unifying the modeling of inter-class dependencies and sample-level perturbation invariance. Extensive experiments demonstrate significant improvements in generalization and robustness across benchmarks—including ScanObjectNN, ModelNet40, ScanNetV2_cls, and ModelNet-C—particularly under diverse point cloud corruptions such as dropout, noise, and occlusion.
📝 Abstract
Classification tasks in 3D point clouds often assume that class events are independent and identically distributed (IID), although this assumption disregards the correlations between classes. This study proposes a classification strategy, **J**oint **G**raph **E**ntropy **K**nowledge **D**istillation (JGEKD), suitable for non-independent and identically distributed 3D point cloud data, which achieves knowledge transfer of class correlations through knowledge distillation by constructing a loss function based on joint graph entropy. First, we employ joint graphs to capture the hidden relationships between classes and implement knowledge distillation to train our model by calculating the entropy of the graph.
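The entropy-of-the-graph idea can be sketched in a few lines. The pairing below — a joint class distribution built from teacher and student probabilities, masked by a binary inter-class adjacency matrix — is an illustrative assumption, not the paper's exact formulation:

```python
import numpy as np

def joint_graph_entropy(p_teacher, p_student, adjacency):
    """Entropy of a joint class distribution restricted to graph edges.

    p_teacher, p_student: (C,) softmax class probabilities.
    adjacency: (C, C) binary matrix encoding hidden inter-class relations.
    The outer-product joint and the edge mask are assumptions made
    for illustration only.
    """
    joint = np.outer(p_teacher, p_student) * adjacency
    joint = joint / joint.sum()              # renormalise over retained edges
    nz = joint[joint > 0]                    # ignore zero-probability edges
    return float(-(nz * np.log(nz)).sum())  # Shannon entropy of the joint

# toy example: 3 classes with a fully connected class graph
t = np.array([0.7, 0.2, 0.1])   # teacher prediction
s = np.array([0.6, 0.3, 0.1])   # student prediction
A = np.ones((3, 3))
loss = joint_graph_entropy(t, s, A)
```

With a fully connected adjacency the joint factorises, so the value reduces to the sum of the two marginal entropies; a sparser class graph would concentrate the joint on semantically related class pairs.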
Subsequently, to handle 3D point clouds that should be invariant to spatial transformations, we construct Siamese structures and develop two frameworks, self-knowledge distillation and teacher-knowledge distillation, to facilitate information transfer between different transformation forms of the same data.
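A minimal sketch of the Siamese consistency idea: feed a point cloud and a spatially transformed copy through the same classifier and penalise disagreement between the two predictions. The `toy_classifier` is a hypothetical stand-in for the real network, and the KL-based consistency term is one plausible choice of transfer loss, not necessarily the paper's:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def kl(p, q, eps=1e-12):
    """KL divergence between two discrete distributions (eps for stability)."""
    return float((p * np.log((p + eps) / (q + eps))).sum())

def random_rotation_z(points, rng):
    """Rotate an (N, 3) point cloud about the z-axis by a random angle."""
    a = rng.uniform(0, 2 * np.pi)
    R = np.array([[np.cos(a), -np.sin(a), 0.0],
                  [np.sin(a),  np.cos(a), 0.0],
                  [0.0,        0.0,       1.0]])
    return points @ R.T

def toy_classifier(points):
    # hypothetical stand-in: any function mapping (N, 3) points to class logits
    return points.mean(axis=0)

rng = np.random.default_rng(0)
pts = rng.normal(size=(128, 3))
p = softmax(toy_classifier(pts))                         # original branch
q = softmax(toy_classifier(random_rotation_z(pts, rng))) # transformed branch
consistency_loss = kl(p, q)  # penalises disagreement across transformations
```

In the self-distillation setting both branches share weights, as here; in the teacher-student setting the transformed branch would instead be scored by a separate (e.g. EMA) teacher network.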
In addition, we use the above frameworks to achieve knowledge transfer between point clouds and their corrupted forms, increasing the robustness of the model against corruption. Extensive experiments on ScanObjectNN, ModelNet40, ScanNetV2_cls, and ModelNet-C demonstrate that the proposed strategy achieves competitive results.