🤖 AI Summary
Traditional Dempster–Shafer theory (DST) classifiers suffer from oversimplified membership function modeling and limited expressiveness of basic probability assignments (BPAs) in multi-attribute classification. To address these issues, this paper proposes a belief-structure-based attribute fusion classifier. It adaptively constructs membership functions from single Gaussian or Gaussian mixture models, selecting between them via cross-validation with a tailored evaluation metric, and introduces a novel transformation of possibility distributions into BPAs by combining simple BPAs derived from normalized possibility distributions, significantly enriching the representation of uncertainty. The same BPA generation scheme is also embedded in an evidential K-nearest neighbors classifier. Experiments on benchmark datasets show an average accuracy improvement of 4.84% over the best existing evidential classifier, with lower variance, superior robustness, and enhanced generalization.
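To make the membership-modeling step concrete, here is a minimal NumPy sketch of the two candidate models the summary mentions: a single-Gaussian membership function and a Gaussian-mixture one fit with a bare-bones EM loop. The function names, the quantile-based initialization, and the peak-rescaling are illustrative assumptions, not the paper's construction, and the cross-validated choice between the two models is omitted.

```python
import numpy as np

def gaussian_membership(x, samples):
    """Single-Gaussian membership for one attribute: fit mean/std to the
    class samples, then score x with a density rescaled to peak at 1."""
    mu, sigma = samples.mean(), samples.std(ddof=1) + 1e-12
    return float(np.exp(-0.5 * ((x - mu) / sigma) ** 2))

def gmm_membership(x, samples, k=2, iters=50):
    """GMM-based membership via a bare-bones EM loop (illustrative; a real
    pipeline would use a library GMM and choose k by cross-validation)."""
    mus = np.quantile(samples, np.linspace(0.1, 0.9, k))  # spread-out init
    sigmas = np.full(k, samples.std(ddof=1) + 1e-12)
    weights = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        dens = np.array([w / s * np.exp(-0.5 * ((samples - m) / s) ** 2)
                         for w, m, s in zip(weights, mus, sigmas)])
        resp = dens / dens.sum(axis=0, keepdims=True)
        # M-step: re-estimate weights, means, and stds
        nk = resp.sum(axis=1)
        weights = nk / nk.sum()
        mus = (resp * samples).sum(axis=1) / nk
        sigmas = np.sqrt((resp * (samples - mus[:, None]) ** 2).sum(axis=1) / nk) + 1e-12

    def density(t):
        return sum(w / s * np.exp(-0.5 * ((t - m) / s) ** 2)
                   for w, m, s in zip(weights, mus, sigmas))

    peak = max(density(m) for m in mus)  # approximate mode height
    return float(density(x) / peak)
```

On bimodal data the single Gaussian smears its mass between the modes, while the mixture keeps membership high at each mode and low between them, which is the motivation for selecting per attribute rather than fixing one model globally.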
📝 Abstract
Dempster–Shafer theory (DST) provides a powerful framework for modeling uncertainty and has been widely applied to multi-attribute classification tasks. However, traditional DST-based attribute fusion classifiers suffer from oversimplified membership function modeling and limited exploitation of the belief structure offered by the basic probability assignment (BPA), reducing their effectiveness in complex real-world scenarios. This paper presents an enhanced attribute fusion-based classifier that addresses these limitations through two key innovations. First, we adopt a selective modeling strategy that uses both single Gaussians and Gaussian Mixture Models (GMMs) for membership function construction, with model selection guided by cross-validation and a tailored evaluation metric. Second, we introduce a novel method that transforms the possibility distribution into a BPA by combining simple BPAs derived from normalized possibility distributions, enabling a much richer and more flexible representation of uncertain information. Furthermore, we apply the belief-structure-based BPA generation method to the evidential K-nearest neighbors classifier, enhancing its ability to incorporate uncertainty information into decision-making. Comprehensive experiments on benchmark datasets evaluate the proposed attribute fusion-based classifier and the enhanced evidential K-nearest neighbors classifier against both evidential classifiers and conventional machine learning classifiers. The results demonstrate that our classifier outperforms the best existing evidential classifier, achieving an average accuracy improvement of 4.84% while maintaining low variance, confirming its superior effectiveness and robustness.
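The second innovation, "combining simple BPAs derived from normalized possibility distributions", can be sketched as follows under one plausible reading: each class contributes a simple BPA (mass on its singleton plus mass on the whole frame), and these are fused with Dempster's rule. This is a generic illustration of that pattern, not necessarily the authors' exact construction.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) with Dempster's
    rule, normalizing away the conflicting (empty-intersection) mass."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    k = 1.0 - conflict
    if k <= 0.0:
        raise ValueError("total conflict: Dempster's rule undefined")
    return {s: v / k for s, v in combined.items()}

def possibility_to_bpa(poss):
    """Turn a possibility distribution {class: possibility} into a BPA by
    normalizing it and Dempster-combining one simple BPA per class:
    m_i({c_i}) = w_i, m_i(Theta) = 1 - w_i, with w_i = pi_i / sum(pi).
    (Sketch only; the weighting scheme here is an assumption.)"""
    theta = frozenset(poss)   # the whole frame of discernment
    total = sum(poss.values())
    bpa = {theta: 1.0}        # start from the vacuous BPA
    for cls, pi in poss.items():
        w = pi / total
        bpa = dempster_combine(bpa, {frozenset([cls]): w, theta: 1.0 - w})
    return bpa
```

For example, `possibility_to_bpa({"a": 0.8, "b": 0.2})` yields mass on the singletons `{a}` and `{b}` plus residual mass on the full frame `{a, b}`; that residual is what lets the resulting belief structure express ignorance rather than forcing a probabilistic commitment.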