Reservoir Computation with Networks of Differentiating Neuron Ring Oscillators

📅 2025-07-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional reservoir computing relies on continuously active integrate-and-fire neurons, resulting in high static power consumption. This work instead proposes a reservoir substrate of event-driven differentiating neurons arranged in a small-world network of ring oscillators. Because these neurons respond only to changes in their input, the design substantially reduces static power dissipation while improving dynamic response efficiency. Methodologically, the authors embed a network of coupled differentiating-neuron oscillators within the reservoir computing framework and evaluate it on the MNIST handwritten digit recognition task. The network achieves a classification accuracy of 90.65%, comparable to existing reservoir computing approaches, providing the first empirical evidence that differentiating neurons can effectively support the reservoir computing paradigm. This work establishes a new architectural foundation, with empirical support, for low-power, event-driven neuromorphic AI systems.

📝 Abstract
Reservoir Computing is a machine learning approach that uses the rich repertoire of complex system dynamics for function approximation. Current approaches to reservoir computing use a network of coupled integrating neurons that require a steady current to maintain activity. Here, we introduce a small-world graph of differentiating neurons, which are active only when their input changes, as an alternative reservoir computing substrate. We find the coupling strength and network topology that enable these small-world networks to function as an effective reservoir. We demonstrate the efficacy of these networks on the MNIST digit recognition task, achieving 90.65% accuracy, comparable to existing reservoir computing approaches. The findings suggest that differentiating neurons are a viable alternative to integrating neurons and point toward a more sustainable future for power-hungry AI applications.
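The abstract's core idea, a small-world reservoir of neurons that respond only to input *changes*, might be sketched as follows. This is an illustrative assumption, not the authors' exact model: the Watts–Strogatz construction stands in for their "small world graph", and a discrete-time `tanh` differentiator stands in for their differentiating neuron.

```python
import numpy as np

def watts_strogatz_adjacency(n, k, p, rng):
    """Small-world coupling matrix: a ring lattice with k neighbors on
    each side, where each rightward edge is rewired with probability p
    (the standard Watts-Strogatz construction)."""
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(1, k + 1):
            A[i, (i + j) % n] = 1.0
            A[i, (i - j) % n] = 1.0
    for i in range(n):
        for j in range(1, k + 1):
            if rng.random() < p:
                old = (i + j) % n
                # candidate targets: no self-loop, no existing edge
                choices = [m for m in range(n) if m != i and A[i, m] == 0]
                if choices:
                    A[i, old] = 0.0
                    A[i, rng.choice(choices)] = 1.0
    return A

def run_differentiating_reservoir(u, A, w_in, gain=0.9):
    """Event-driven reservoir update: each neuron fires on the *change*
    in its total drive (a discrete-time differentiator), so a constant
    input eventually elicits no sustained activity."""
    n = A.shape[0]
    x = np.zeros(n)
    prev_drive = np.zeros(n)
    states = []
    for u_t in u:
        drive = gain * (A @ x) + w_in * u_t
        x = np.tanh(drive - prev_drive)   # respond only to the change
        prev_drive = drive
        states.append(x.copy())
    return np.array(states)
```

With zero input the reservoir stays silent, which is the static-power argument in miniature: unlike an integrating reservoir, no activity is needed to "hold" a constant signal.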
Problem

Research questions and friction points this paper is trying to address.

Exploring differentiating neurons as reservoir computing substrate
Optimizing coupling strength and network topology for effective reservoirs
Demonstrating MNIST digit recognition performance comparable to existing methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Differentiating neuron ring oscillator network
Small world graph topology optimization
MNIST digit recognition performance validation
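In reservoir computing, only the linear readout from reservoir states to class labels is trained; the reservoir itself is fixed. A minimal sketch of such a readout, assuming a standard ridge-regression fit to one-hot targets (the paper does not specify its readout training method):

```python
import numpy as np

def train_readout(states, labels, n_classes, ridge=1e-3):
    """Fit a linear readout from reservoir states to one-hot class
    targets by ridge regression -- the only trained part of an RC system."""
    Y = np.eye(n_classes)[labels]                            # one-hot targets
    X = np.hstack([states, np.ones((states.shape[0], 1))])   # append bias column
    W = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)
    return W

def predict(states, W):
    """Classify by the largest readout score."""
    X = np.hstack([states, np.ones((states.shape[0], 1))])
    return np.argmax(X @ W, axis=1)
```

For a task like MNIST, `states` would be one reservoir-state vector per digit image (e.g., the final or time-averaged state after presenting the pixels as an input stream), and `labels` the digits 0–9.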
Authors

Alexander Yeung
College of Computer & Information Sciences, University of Massachusetts Amherst

Peter DelMastro
Department of Mathematics, Virginia Tech

Arjun Karuvally
College of Computer & Information Sciences, University of Massachusetts Amherst

Hava Siegelmann
Professor of Computer Science and Brain Sciences, UMass Amherst
AI · Computational Neuroscience · Computational Biology

Edward Rietman
College of Computer & Information Sciences, University of Massachusetts Amherst

Hananel Hazan
Allen Discovery Center, Tufts University, USA
Neurocomputation · Neural Networks · Machine Learning · Artificial Intelligence · Real-Time Systems