BiHDTrans: binary hyperdimensional transformer for efficient multivariate time series classification

📅 2025-09-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of achieving both high accuracy and efficiency in multivariate time series (MTS) classification on resource-constrained edge devices, this paper proposes the first binary hyperdimensional Transformer architecture, which synergistically integrates hyperdimensional computing with self-attention. Self-attention is embedded into a binary high-dimensional space, with a theoretical proof that this incurs less information loss than directly binarizing a neural network; leveraging the i.i.d. property of high-dimensional vectors, inference is further accelerated via a pipelined FPGA hardware implementation. Experiments on standard MTS classification benchmarks demonstrate that the method achieves at least 14.47% higher accuracy than state-of-the-art hyperdimensional models and an average of 6.67% higher accuracy than binary Transformers. On FPGA, the pipelined implementation reduces inference latency by 39.4× relative to SOTA binary Transformers; with a 64% reduction in hyperspace dimensionality, the model is additionally 4.4× smaller and latency drops by a further 49.8%.

📝 Abstract
The proliferation of Internet-of-Things (IoT) devices has led to an unprecedented volume of multivariate time series (MTS) data, requiring efficient and accurate processing for timely decision-making in resource-constrained edge environments. Hyperdimensional (HD) computing, with its inherent efficiency and parallelizability, has shown promise in classification tasks but struggles to capture complex temporal patterns, while Transformers excel at sequence modeling but incur high computational and memory overhead. We introduce BiHDTrans, an efficient neurosymbolic binary hyperdimensional Transformer that integrates self-attention into the HD computing paradigm, unifying the representational efficiency of HD computing with the temporal modeling power of Transformers. Empirically, BiHDTrans outperforms state-of-the-art (SOTA) HD computing models by at least 14.47% and achieves 6.67% higher accuracy on average than SOTA binary Transformers. With hardware acceleration on FPGA, our pipelined implementation leverages the independent and identically distributed properties of high-dimensional representations, delivering 39.4 times lower inference latency than SOTA binary Transformers. Theoretical analysis shows that binarizing in holographic high-dimensional space incurs significantly less information distortion than directly binarizing neural networks, explaining BiHDTrans's superior accuracy. Furthermore, dimensionality experiments confirm that BiHDTrans remains competitive even with a 64% reduction in hyperspace dimensionality, surpassing SOTA binary Transformers by 1-2% in accuracy with a 4.4 times smaller model, while further reducing latency by 49.8% compared to the full-dimensional baseline. Together, these contributions bridge the gap between the expressiveness of Transformers and the efficiency of HD computing, enabling accurate, scalable, and low-latency MTS classification.
Problem

Research questions and friction points this paper is trying to address.

Efficient multivariate time series classification for resource-constrained IoT environments
Integrating temporal modeling of Transformers with efficiency of hyperdimensional computing
Reducing computational overhead while maintaining high classification accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates self-attention into hyperdimensional computing paradigm
Binarizes holographic high-dimensional space to reduce distortion
Leverages FPGA acceleration for low-latency inference
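The bullets above can be illustrated with a minimal sketch of a binarized self-attention step in a bipolar hyperspace. This is an illustrative assumption of how such an operation might look, not the paper's actual architecture: the random-projection encoder, the dimensions `D` and `T`, and the softmax attention are all hypothetical choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 1024  # hyperspace dimensionality (illustrative; the paper explores reducing it by up to 64%)
T = 8     # number of time steps (tokens) in the multivariate series

def binarize(x):
    """Map real-valued vectors to bipolar {-1, +1} hypervectors via sign."""
    return np.where(x >= 0, 1, -1)

# Hypothetical encoding: project each time step's features into the hyperspace,
# then binarize. Holographic representations tolerate this quantization well
# because information is spread across all D roughly i.i.d. components.
features = rng.normal(size=(T, 16))
projection = rng.normal(size=(16, D))
H = binarize(features @ projection)  # (T, D) bipolar hypervectors

# Binary self-attention sketch: attention scores come from bipolar dot
# products (equivalent to D minus twice the Hamming distance), normalized
# over time steps, and the attended output is re-binarized to stay in the
# binary hyperspace.
scores = (H @ H.T) / np.sqrt(D)
weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
attended = binarize(weights @ H)     # (T, D), still bipolar
```

Because every intermediate stays in {-1, +1}, the dot products reduce to XNOR-popcount operations in hardware, which is what makes a pipelined FPGA implementation of this kind of model attractive.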
Jingtao Zhang
School of Biomedical Engineering, Shenzhen Campus of Sun Yat-sen University
Yi Liu
School of Biomedical Engineering, Shenzhen Campus of Sun Yat-sen University
Qi Shen
Active Materials and Smart Living Laboratory, University of Nevada, Las Vegas
Soft robotics, Smart Materials, Bioinspiration, Physical modeling, Actuators/Sensors
Changhong Wang
School of Biomedical Engineering, Shenzhen Campus of Sun Yat-sen University