Towards Vector Optimization on Low-Dimensional Vector Symbolic Architecture

📅 2025-02-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the fundamental trade-off between accuracy and efficiency in low-dimensional Vector Symbolic Architectures (VSAs). To resolve it, the authors propose a stable, adaptive optimization framework for low-dimensional VSAs that integrates gradient-driven vector optimization, adaptive dynamic updates, zero-overhead batch normalization (BN), and knowledge distillation (KD). They first uncover the mechanism by which BN yields accuracy gains in low-dimensional VSAs without incurring any inference latency, demonstrating its unique zero-cost benefit in this regime, and further show that KD substantially improves inference confidence. The analysis also extends interpretability results to binary optimization within VSAs. Experiments demonstrate that the approach maintains the original model's accuracy under ~100× dimensionality compression, that BN introduces no additional inference overhead, and that KD significantly boosts confidence scores. Comprehensive evaluations across multiple benchmarks and ablation studies validate the effectiveness and robustness of the proposed framework.
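The "zero inference overhead" claim for BN rests on a standard identity: a batch-norm layer's affine transform can be folded into the preceding linear weights once training ends, so the deployed model performs exactly the same number of operations as one trained without BN. The sketch below is not the authors' implementation; it is a minimal numpy illustration of that folding, with all statistics chosen as random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((10, 64))  # class weight matrix; 64-d stands in for a low-dim VSA
b = np.zeros(10)
x = rng.standard_normal(64)        # one input feature vector

# BN parameters/statistics as they would exist after training (hypothetical values)
gamma, beta = rng.standard_normal(10), rng.standard_normal(10)
mu, var, eps = rng.standard_normal(10), rng.random(10) + 0.5, 1e-5

# Training-time path: linear layer followed by batch normalization
y_bn = gamma * (W @ x + b - mu) / np.sqrt(var + eps) + beta

# Inference-time path: fold BN into W and b once, offline
scale = gamma / np.sqrt(var + eps)
W_fold = W * scale[:, None]
b_fold = scale * (b - mu) + beta
y_fold = W_fold @ x + b_fold       # a single matrix-vector product, no BN ops left

assert np.allclose(y_bn, y_fold)   # identical outputs, zero extra inference cost
```

Because the folding is exact, BN can only help training dynamics; it costs nothing at deployment, which matches the zero-cost benefit the summary describes.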

📝 Abstract
Vector Symbolic Architecture (VSA) is emerging in machine learning due to its efficiency, but it is hindered by issues of hyperdimensionality and accuracy. As a promising mitigation, the Low-Dimensional Computing (LDC) method significantly reduces the vector dimension by ~100 times while maintaining accuracy, by employing gradient-based optimization. Despite its potential, LDC optimization for VSA is still underexplored. Our investigation into vector updates underscores the importance of stable, adaptive dynamics in LDC training. We also reveal the overlooked yet critical roles of batch normalization (BN) and knowledge distillation (KD) in standard approaches. Beyond the accuracy boost, BN adds no computational overhead during inference, and KD significantly enhances inference confidence. Through extensive experiments and ablation studies across multiple benchmarks, we provide a thorough evaluation of our approach and extend interpretability analysis to binary neural network (BNN) optimization similar to LDC, an aspect previously unaddressed in the BNN literature.
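The abstract credits KD with enhancing inference confidence. The paper's exact loss is not given here, so the following is a generic Hinton-style distillation objective, hard-label cross-entropy plus a temperature-softened KL term against a teacher, written in plain numpy; the temperature and mixing weight are illustrative assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hard-label cross-entropy blended with softened KL to the teacher."""
    p_s = softmax(student_logits)
    ce = -np.log(p_s[np.arange(len(labels)), labels]).mean()
    q_t = softmax(teacher_logits, T)              # softened teacher targets
    log_ps_T = np.log(softmax(student_logits, T))
    kl = (q_t * (np.log(q_t) - log_ps_T)).sum(axis=-1).mean()
    return alpha * ce + (1 - alpha) * (T ** 2) * kl  # T^2 rescales soft gradients

# Toy batch: 2 samples, 3 classes (hypothetical logits)
s = np.array([[2.0, 0.5, -1.0], [0.1, 1.5, 0.2]])
t = np.array([[3.0, 0.0, -2.0], [-0.5, 2.5, 0.0]])
y = np.array([0, 1])
loss = kd_loss(s, t, y)
assert loss > 0
```

Matching the teacher's full output distribution, rather than only the hard label, tends to sharpen the student's correct-class probabilities, which is one plausible mechanism behind the confidence gains reported.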
Problem

Research questions and friction points this paper is trying to address.

Optimizing vector dimensions in VSA
Enhancing accuracy with LDC methods
Improving inference confidence via BN and KD
Innovation

Methods, ideas, or system contributions that make the work stand out.

Low-Dimensional Computing reduces vector dimension
Batch normalization enhances accuracy without overhead
Knowledge distillation significantly boosts inference confidence
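To make the dimension-reduction claim concrete: classic hyperdimensional VSA classifiers use ~10,000-d vectors, while LDC trains vectors roughly 100× smaller. The sketch below shows nearest-class inference with 64-d bipolar vectors; the class vectors here are random placeholders standing in for LDC's gradient-trained ones.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 64            # low-dimensional (LDC); classic HDC would use D ~ 10_000
n_classes = 10

# Bipolar {-1, +1} class vectors; in LDC these are learned with gradients,
# here they are random stand-ins.
class_vecs = rng.choice([-1, 1], size=(n_classes, D))

def classify(query):
    # Nearest class by dot-product similarity
    # (equivalent to minimum Hamming distance for bipolar vectors).
    return int(np.argmax(class_vecs @ query))

# A noisy copy of class 3's vector (~10% of components flipped)
q = class_vecs[3] * np.where(rng.random(D) < 0.1, -1, 1)
assert classify(q) == 3   # still recovered despite the noise
```

Even at 64 dimensions, random bipolar vectors are nearly orthogonal in expectation, so a moderately corrupted query still lands closest to its own class; LDC's training then recovers the remaining accuracy that raw random vectors lose at this scale.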