🤖 AI Summary
Existing hyperdimensional computing (HDC) models struggle to represent and factorize multi-object, hierarchical class–subclass relationships, limiting factorized reasoning in neuro-symbolic AI. To address this, we propose FactorHD, a novel HDC framework that mitigates both the "superposition catastrophe" and the "problem of 2" through a symbolic encoding mechanism with memorization clauses and a factorization algorithm that selectively eliminates redundant classes. Evaluated at a representation size of 10⁹, FactorHD achieves an approximately 5667× speedup over existing HDC models, and when integrated with a ResNet-18 neural network it attains 92.48% factorization accuracy on CIFAR-10. The framework improves both efficiency and accuracy in representing and factorizing multiple objects with class–subclass relations.
📝 Abstract
Neuro-symbolic artificial intelligence (neuro-symbolic AI) excels at logical analysis and reasoning. Hyperdimensional computing (HDC), a promising brain-inspired computational model, is integral to neuro-symbolic AI. Various HDC models have been proposed to represent class-instance and class-class relations, but when representing the more complex class-subclass relation, where multiple objects are associated with different levels of classes and subclasses, they face challenges in factorization, a crucial task for neuro-symbolic AI systems. In this article, we propose FactorHD, a novel HDC model capable of representing and factorizing the complex class-subclass relation efficiently. FactorHD features a symbolic encoding method that embeds an extra memorization clause, preserving more information about multiple objects. In addition, it employs an efficient factorization algorithm that selectively eliminates redundant classes by identifying the memorization clause of the target class. This model significantly enhances computing efficiency and accuracy in representing and factorizing multiple objects with the class-subclass relation, overcoming limitations of existing HDC models such as the "superposition catastrophe" and "the problem of 2". Evaluations show that FactorHD achieves approximately 5667x speedup at a representation size of 10^9 compared to existing HDC models. When integrated with the ResNet-18 neural network, FactorHD achieves 92.48% factorization accuracy on the CIFAR-10 dataset.
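To make the underlying HDC operations concrete, the sketch below illustrates generic bipolar hypervector binding, superposition (bundling), and a factorization-style query via unbinding and codebook cleanup. This is a minimal toy example of standard HDC primitives, not FactorHD's actual encoding or its memorization-clause algorithm; the dimensionality, codebook names, and role-filler scheme are all illustrative assumptions.

```python
import numpy as np

D = 10_000  # hypervector dimensionality (illustrative; the paper evaluates sizes up to 10^9)
rng = np.random.default_rng(0)

def hv():
    """Random bipolar hypervector in {-1, +1}^D."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding via elementwise multiplication (self-inverse for bipolar vectors)."""
    return a * b

def bundle(vecs):
    """Superposition via elementwise majority (sign of the sum)."""
    s = np.sum(vecs, axis=0)
    return np.where(s >= 0, 1, -1)

def sim(a, b):
    """Normalized dot product in [-1, 1]."""
    return float(a @ b) / D

# Three hypothetical class-object pairs superposed into one representation.
classes = {name: hv() for name in ["color", "shape", "size"]}
objects = {name: hv() for name in ["red", "square", "large"]}
scene = bundle([bind(classes["color"], objects["red"]),
                bind(classes["shape"], objects["square"]),
                bind(classes["size"], objects["large"])])

# Factorization-style query: unbind a class role, then clean up against the codebook.
noisy = bind(scene, classes["color"])  # approximately objects["red"] plus crosstalk
best = max(objects, key=lambda k: sim(noisy, objects[k]))
print(best)  # the correct filler survives the crosstalk from the other pairs
```

As more pairs are superposed, the crosstalk terms grow and cleanup eventually fails; this is the "superposition catastrophe" that motivates richer encodings like the one the article proposes.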