Determinism in the Undetermined: Deterministic Output in Charge-Conserving Continuous-Time Neuromorphic Systems with Temporal Stochasticity

📅 2026-03-16
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the challenge of achieving deterministic computation in asynchronous neuromorphic systems, where timing stochasticity inherent in continuous-time hardware typically undermines reproducibility. The authors propose a unified continuous-time spiking neural network (SNN) framework grounded in the principle of charge conservation and minimal neuron constraints. They rigorously prove, for the first time, that in acyclic architectures, the network output depends exclusively on the total input charge and is entirely invariant to spike timing, thereby achieving complete immunity to temporal randomness. Furthermore, the framework establishes an exact, approximation-free correspondence between charge-conserving SNNs and quantized artificial neural networks, offering a theoretical foundation for integrating event-driven dynamic systems with static deep learning while preserving algorithmic determinism and enabling efficient asynchronous processing.
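The central claim — that in a charge-conserving, acyclic network the cumulative output depends only on the total input charge, not on when individual spikes arrive — can be illustrated with a minimal sketch. This is not the authors' code; the reset-by-subtraction integrate-and-fire neuron and the threshold value are illustrative assumptions standing in for their "minimal neuron constraints":

```python
# Hypothetical sketch (not the paper's implementation): a single
# integrate-and-fire neuron with reset-by-subtraction. The subtractive
# reset conserves residual charge, so the final spike count depends
# only on the total input charge, not on how it is split over time.

def if_neuron_spikes(charges, threshold=1.0):
    """Feed a sequence of input charge packets; return total output spikes."""
    v = 0.0
    spikes = 0
    for q in charges:
        v += q
        while v >= threshold:
            v -= threshold  # subtract, don't zero: residual charge survives
            spikes += 1
    return spikes

# Same total charge (3.7), delivered with different "timings"/partitions:
print(if_neuron_spikes([3.7]))              # one big packet     -> 3
print(if_neuron_spikes([0.5] * 7 + [0.2]))  # many small packets -> 3
print(if_neuron_spikes([1.2, 0.1, 2.4]))    # irregular packets  -> 3
```

All three orderings yield the same spike count, which is the timing-invariance property the paper proves in the general acyclic case.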

📝 Abstract
Achieving deterministic computation results in asynchronous neuromorphic systems remains a fundamental challenge due to the inherent temporal stochasticity of continuous-time hardware. To address this, we develop a unified continuous-time framework for spiking neural networks (SNNs) that couples the Law of Charge Conservation with minimal neuron-level constraints. This integration ensures that the terminal state depends solely on the aggregate input charge, yielding a unique cumulative output that is invariant to temporal stochasticity. We prove that this mapping is strictly invariant to spike timing in acyclic networks, whereas recurrent connectivity can introduce temporal sensitivity. Furthermore, we establish an exact representational correspondence between these charge-conserving SNNs and quantized artificial neural networks, bridging the gap between static deep learning and event-driven dynamics without approximation errors. These results establish a rigorous theoretical basis for designing continuous-time neuromorphic systems that harness the efficiency of asynchronous processing while maintaining algorithmic determinism.
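The claimed exact correspondence with quantized ANNs can also be sketched for a single neuron under assumed conditions (nonnegative input charge, reset-by-subtraction, a hypothetical threshold of 0.25): the neuron's total spike count equals a floor-quantized ReLU whose quantization step is the firing threshold. This is an illustrative toy, not the paper's general construction:

```python
# Hypothetical single-neuron sketch of the SNN <-> quantized-ANN
# correspondence. Assumptions: nonnegative total input charge,
# subtractive reset, threshold 0.25 chosen arbitrarily.
import math

def snn_output(total_charge, threshold=0.25):
    """Spike count of a charge-conserving IF neuron over a full run."""
    v, spikes = total_charge, 0
    while v >= threshold:
        v -= threshold
        spikes += 1
    return spikes

def quantized_relu(x, step=0.25):
    """Floor-quantized ReLU with quantization step equal to the threshold."""
    return math.floor(max(x, 0.0) / step)

# The two agree exactly, with no approximation error:
for x in [0.0, 0.3, 0.77, 1.0, 2.49]:
    assert snn_output(x) == quantized_relu(x)
```

Under these assumptions the mapping is exact rather than approximate, which is the sense in which the paper bridges event-driven SNNs and static quantized networks.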
Problem

Research questions and friction points this paper is trying to address.

determinism
temporal stochasticity
neuromorphic systems
spiking neural networks
charge conservation
Innovation

Methods, ideas, or system contributions that make the work stand out.

charge conservation
deterministic neuromorphic computing
temporal stochasticity
spiking neural networks
quantized ANNs
Jing Yan
School of Mathematical Sciences, Institute of Natural Sciences and MOE-LSC, Shanghai Jiao Tong University, Shanghai, China
Kang You
School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai, China; Shanghai Artificial Intelligence Laboratory, Shanghai, China
Zhezhi He
Associate Professor, Shanghai Jiao Tong University
Intelligent Computing · Neuromorphic Computing · Computer Architecture · EDA
Yaoyu Zhang
Shanghai Jiao Tong University
Deep Learning Theory