🤖 AI Summary
Traditional reservoir computing relies on continuously active integrate-and-fire neurons, resulting in high static power consumption. This work instead proposes a reservoir substrate of event-driven differential neurons arranged in a small-world ring oscillator network; the neurons respond only to changes in the input, substantially reducing static power dissipation and improving dynamic response efficiency. Methodologically, the authors embed a coupled differential-neuron oscillator network within the reservoir computing framework and evaluate it on the MNIST handwritten digit recognition task. The network achieves a classification accuracy of 90.65%, comparable to state-of-the-art reservoir computing approaches, providing the first empirical validation that differential neurons can effectively support the reservoir computing paradigm. This work thus establishes a new architectural foundation, and provides empirical evidence, for low-power, event-driven neuromorphic AI systems.
📝 Abstract
Reservoir computing is a machine learning approach that exploits the rich repertoire of complex system dynamics for function approximation. Current approaches use a network of coupled integrating neurons that require a steady current to maintain activity. Here, we introduce a small-world graph of differentiating neurons, which are active only when the input changes, as an alternative reservoir computing substrate. We identify the coupling strength and network topology that enable these small-world networks to function as an effective reservoir, and we demonstrate their efficacy on the MNIST digit recognition task, achieving an accuracy of 90.65%, comparable to existing reservoir computing approaches. These findings suggest that differentiating neurons are a viable alternative to integrating neurons and may offer a sustainable path for power-hungry AI applications.
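The core idea above can be sketched in a few lines of Python: a Watts–Strogatz-style small-world ring supplies the coupling topology, and each neuron responds to the *change* in its input rather than its level, so a constant input leaves the reservoir silent. This is an illustrative toy model only, with made-up names and parameters (`small_world_ring`, `run_reservoir`, coupling gain `g`); it is not the authors' implementation, and a trained linear readout over the collected states would complete the pipeline.

```python
import numpy as np

def small_world_ring(n, k=4, p=0.1, rng=None):
    """Watts-Strogatz-style small-world adjacency: a ring lattice where each
    node links to its k nearest neighbours, with edges rewired with prob. p."""
    rng = np.random.default_rng(rng)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(1, k // 2 + 1):
            A[i, (i + j) % n] = A[(i + j) % n, i] = 1.0
    for i in range(n):
        for j in range(1, k // 2 + 1):
            if rng.random() < p:                      # rewire this lattice edge
                old, new = (i + j) % n, int(rng.integers(n))
                if new != i and A[i, new] == 0:
                    A[i, old] = A[old, i] = 0.0
                    A[i, new] = A[new, i] = 1.0
    return A

def run_reservoir(U, W_in, A, g=0.5):
    """Differentiating-neuron reservoir: neurons are driven by the input
    *difference* u_t - u_{t-1}, so a constant input produces no activity."""
    n = A.shape[0]
    x = np.zeros(n)
    u_prev = U[0].copy()
    states = []
    for u in U:
        # coupling is normalised by the maximum node degree to keep x bounded
        x = np.tanh(W_in @ (u - u_prev) + g * (A @ x) / max(A.sum(1).max(), 1))
        states.append(x.copy())
        u_prev = u
    return np.array(states)
```

A linear classifier (e.g. ridge regression) trained on the rows of `run_reservoir`'s output would then play the role of the readout in the MNIST experiment; only the readout weights are learned, which is what makes the reservoir paradigm cheap to train.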