🤖 AI Summary
The regulatory role of astrocyte-like units in spiking neural networks (SNNs) and the impact of astrocyte–neuron ratio on learning performance remain poorly understood. Method: We propose a hybrid SNN architecture inspired by liquid state machines (LSMs), integrating biologically plausible astrocytic dynamics for chaotic time-series prediction. Contribution/Results: We demonstrate for the first time that astrocyte-like units significantly modulate synaptic plasticity and temporal information integration. Empirical results show that a 2:1 astrocyte-to-neuron ratio yields peak learning speed, along with superior prediction accuracy and robustness compared to purely neuronal or purely astrocytic networks. Notably, this optimal ratio closely matches the empirically observed astrocyte–neuron ratio in mammalian brains, providing strong evidence for astrocytes’ critical role in long-term dynamic representation and cooperative learning. Our work establishes a novel, biologically interpretable neuromorphic computing paradigm grounded in glial–neuronal co-processing principles.
📝 Abstract
Traditional artificial neural networks take inspiration from biological networks, using layers of neuron-like nodes to pass information for processing. More realistic models include spiking in the neural network, capturing the electrical characteristics more closely. However, a large proportion of brain cells are of the glial cell type, in particular astrocytes, which have been suggested to play a role in performing computations. Here, we introduce a modified spiking neural network model with added astrocyte-like units and assess their impact on learning. We implement the network as a liquid state machine and task it with a chaotic time-series prediction task. We varied the number and ratio of neuron-like and astrocyte-like units in the network to examine these units' effect on learning. We show that the combination of neurons and astrocytes together, as opposed to neural-only or astrocyte-only networks, is critical for driving learning. Interestingly, we found that the highest learning rate was achieved when the ratio between astrocyte-like and neuron-like units was roughly 2 to 1, mirroring some estimates of the ratio of biological astrocytes to neurons in mammalian brains. Our results demonstrate that incorporating astrocyte-like units, which represent information across longer timescales, can alter the learning rates of neural networks, and that the proportion of astrocytes to neurons should be tuned appropriately to a given task.
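The core idea, a reservoir containing a mix of fast neuron-like units and slow astrocyte-like units with only a linear readout trained, can be sketched as follows. This is a simplified rate-based analogue of the paper's spiking liquid state machine, not the authors' implementation: the unit counts, leak rates, weight scaling, and the noisy-sine stand-in for a chaotic series are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes giving a ~2:1 astrocyte-to-neuron ratio.
n_neurons, n_astro = 100, 200
n_total = n_neurons + n_astro

# Random recurrent reservoir weights, scaled for stable dynamics.
W = rng.normal(0.0, 1.0, (n_total, n_total)) / np.sqrt(n_total)
W_in = rng.normal(0.0, 1.0, n_total)

# Per-unit leak rates: neuron-like units integrate quickly, while
# astrocyte-like units retain state over much longer timescales.
leak = np.concatenate([np.full(n_neurons, 0.9),    # fast
                       np.full(n_astro, 0.05)])    # slow

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_total)
    states = []
    for u_t in u:
        pre = np.tanh(W @ x + W_in * u_t)
        x = (1.0 - leak) * x + leak * pre          # leaky integration
        states.append(x.copy())
    return np.array(states)

# Toy input: a noisy sine stands in for a chaotic series (e.g. Mackey-Glass).
T = 500
u = np.sin(0.3 * np.arange(T)) + 0.1 * rng.normal(size=T)

X = run_reservoir(u[:-1])
y = u[1:]                                          # one-step-ahead target

# Only the linear readout is trained, here via ridge regression.
W_out = np.linalg.solve(X.T @ X + 1e-3 * np.eye(n_total), X.T @ y)
mse = np.mean((X @ W_out - y) ** 2)
```

Sweeping `n_neurons` and `n_astro` while holding `n_total` fixed, as the abstract describes, would then reveal how the unit ratio affects prediction error and learning speed.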