🤖 AI Summary
Addressing the challenges of modeling long-range dependencies, uncontrolled memory decay, and high computational complexity in event-driven learning, this paper proposes the first adaptive hybrid framework integrating spiking neural dynamics with structured state space models (SSMs). Our approach introduces two key innovations: (1) a spike-aware HiPPO mechanism that dynamically modulates memory retention strength based on inter-spike intervals; and (2) an NPLR matrix decomposition technique that reduces SSM inference complexity from $O(L^2)$ to $O(L \log L)$, overcoming classical scalability bottlenecks. Evaluated on benchmark long-range sequence tasks (Long Range Arena) and real-world neuromorphic datasets (HAR-DVS, Celex-HAR), our method achieves state-of-the-art accuracy in long-range reasoning while incurring significantly lower computational overhead. It thus markedly improves both modeling efficiency and generalization capability for event sequences.
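The spike-aware memory idea above can be illustrated with a minimal sketch. This is a hypothetical toy model (not the paper's implementation): a state-space update where the retention factor decays with the inter-spike interval `dt`, so sparse spiking weakens memory while dense spiking preserves it. The function name `sa_hippo_step` and the decay rate `lam` are illustrative assumptions.

```python
import numpy as np

def sa_hippo_step(h, x, A, B, dt, lam=1.0):
    """Toy spike-aware state update (illustrative, not the paper's SA-HiPPO).

    h  : hidden state vector
    x  : scalar input at the current spike
    A  : state matrix (e.g., a HiPPO-style matrix)
    B  : input vector
    dt : inter-spike interval since the previous event
    lam: hypothetical decay rate controlling interval-dependent forgetting
    """
    # Retention shrinks as the inter-spike interval grows.
    retention = np.exp(-lam * dt)
    # Euler-discretized linear state-space update, scaled by retention.
    return retention * (h + dt * (A @ h + B * x))
```

Under this toy dynamic, a longer gap between events yields a smaller retention factor, mimicking the interval-dependent memory decay the summary describes.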
📝 Abstract
We propose **FLAMES (Fast Long-range Adaptive Memory for Event-based Systems)**, a novel hybrid framework integrating structured state-space dynamics with event-driven computation. At its core, the *Spike-Aware HiPPO (SA-HiPPO) mechanism* dynamically adjusts memory retention based on inter-spike intervals, preserving both short- and long-range dependencies. To maintain computational efficiency, we introduce a normal-plus-low-rank (NPLR) decomposition, reducing complexity from $\mathcal{O}(N^2)$ to $\mathcal{O}(Nr)$. FLAMES achieves state-of-the-art results on the Long Range Arena benchmark and event datasets like HAR-DVS and Celex-HAR. By bridging neuromorphic computing and structured sequence modeling, FLAMES enables scalable long-range reasoning in event-driven systems.
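The complexity gain from a normal-plus-low-rank decomposition can be sketched concretely. The idea, common to structured SSMs, is that a matrix of the form $D + UV^\top$ (normal part approximated here by a diagonal $D$, plus a rank-$r$ correction) supports matrix-vector products in $\mathcal{O}(Nr)$ without ever materializing the dense $N \times N$ matrix. The helper name `nplr_matvec` is an assumption for illustration, not the paper's API.

```python
import numpy as np

def nplr_matvec(d, U, V, x):
    """Compute (diag(d) + U @ V.T) @ x in O(N*r) time.

    d : (N,)   diagonal of the normal part
    U : (N, r) left low-rank factor
    V : (N, r) right low-rank factor
    x : (N,)   input vector
    """
    # Diagonal part: O(N). Low-rank part: V.T @ x is O(N*r),
    # then U @ (r-vector) is O(N*r). Total O(N*r) vs O(N^2) dense.
    return d * x + U @ (V.T @ x)
```

For $r \ll N$ this is the standard trick that avoids the quadratic cost of a dense state matrix while retaining its expressiveness.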