The Influence of Initial Connectivity on Biologically Plausible Learning

📅 2024-10-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates how initial synaptic connectivity influences the training efficacy of biologically plausible learning rules (local, synapse-computable updates) in recurrent neural networks (RNNs). Prior work has largely neglected the interaction between initialization and biological constraints; here, the authors systematically analyze how initial weight magnitude regulates information propagation dynamics and learning performance. They propose extended gradient flossing, the first Lyapunov-exponent-based regularization method tailored to biologically plausible learning, which stabilizes forward dynamics and substantially improves training success rates and generalization. Experiments show that different initializations impose divergent stability requirements on information flow. The framework yields testable neurodynamic predictions about synaptic plasticity mechanisms and suggests a co-design approach to weight initialization and learning algorithms for neuromorphic hardware.
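The summary's central variable, initial weight magnitude, is commonly controlled in RNN studies by a single gain parameter on Gaussian recurrent weights. A minimal NumPy sketch of this standard scheme (the function name is illustrative; the paper's exact initialization may differ):

```python
import numpy as np

def init_recurrent_weights(n, g, seed=0):
    """Gaussian recurrent weights with entries ~ N(0, g^2 / n).

    For large n the spectral radius concentrates near g, so the gain
    sets the dynamical regime: g < 1 gives decaying activity, g > 1
    pushes the network toward chaotic dynamics.
    """
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, g / np.sqrt(n), size=(n, n))

W = init_recurrent_weights(512, g=1.5)
print(abs(np.linalg.eigvals(W)).max())  # close to g = 1.5
```

Sweeping g and measuring training success is the typical way such initialization effects are probed.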

📝 Abstract
Understanding how the brain learns can be advanced by investigating biologically plausible learning rules: those that obey known biological constraints, such as locality, to serve as valid brain learning models. Yet, many studies overlook the role of architecture and initial synaptic connectivity in such models. Building on insights from deep learning, where initialization profoundly affects learning dynamics, we ask a key but underexplored neuroscience question: how does initial synaptic connectivity shape learning in neural circuits? To investigate this, we train recurrent neural networks (RNNs), which are widely used for brain modeling, with biologically plausible learning rules. Our findings reveal that initial weight magnitude significantly influences the learning performance of such rules, mirroring effects previously observed in training with backpropagation through time (BPTT). By examining the maximum Lyapunov exponent before and after training, we uncovered the greater demands that certain initialization schemes place on training to achieve desired information propagation properties. Consequently, we extended the recently proposed gradient flossing method, which regularizes the Lyapunov exponents, to biologically plausible learning and observed an improvement in learning performance. To our knowledge, we are the first to examine the impact of initialization on biologically plausible learning rules for RNNs and to subsequently propose a biologically plausible remedy. Such an investigation can lead to neuroscientific predictions about the influence of initial connectivity on learning dynamics and performance, as well as guide neuromorphic design.
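The maximum Lyapunov exponent mentioned in the abstract measures the average exponential growth rate of a small perturbation along a trajectory. A minimal NumPy sketch of the standard tangent-vector estimator, assuming simple discrete-time rate dynamics h_{t+1} = tanh(W h_t) (the paper's networks and tasks may differ):

```python
import numpy as np

def max_lyapunov_exponent(W, steps=2000, burn_in=200, seed=0):
    """Estimate the maximum Lyapunov exponent of h_{t+1} = tanh(W h_t)
    by propagating a tangent vector through the Jacobian
    J_t = diag(1 - h_{t+1}^2) @ W and averaging its log growth rate."""
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    h = rng.normal(size=n) * 0.5          # random initial state
    v = rng.normal(size=n)                # random tangent vector
    v /= np.linalg.norm(v)
    log_growth = 0.0
    for t in range(burn_in + steps):
        h = np.tanh(W @ h)
        v = (1.0 - h**2) * (W @ v)        # apply the Jacobian
        norm = np.linalg.norm(v)
        v /= norm                         # renormalize to avoid overflow
        if t >= burn_in:
            log_growth += np.log(norm)
    return log_growth / steps

g, n = 2.0, 300
W = np.random.default_rng(1).normal(0.0, g / np.sqrt(n), (n, n))
print(max_lyapunov_exponent(W))  # positive: chaotic regime
```

A negative exponent indicates decaying perturbations, zero marks the edge of chaos, and a positive value signals chaotic dynamics; comparing this quantity before and after training is how the abstract's initialization-dependent stability demands are quantified.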
Problem

Research questions and friction points this paper is trying to address.

Initial Neural Connectivity
Recurrent Neural Networks (RNNs)
Brain Learning Rules
Innovation

Methods, ideas, or system contributions that make the work stand out.

Initial Conditions
Recurrent Neural Networks (RNNs)
Gradient Flossing
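Gradient flossing, the regularization technique the paper extends, penalizes finite-time Lyapunov exponents so that they are pushed toward zero. A minimal NumPy sketch of the forward computation of such a regularizer via QR iteration on the tangent dynamics, again assuming h_{t+1} = tanh(W h_t); the actual method differentiates this loss during training, and the function name is illustrative:

```python
import numpy as np

def flossing_loss(W, h0, k=5, steps=200, seed=0):
    """Compute the leading k finite-time Lyapunov exponents of
    h_{t+1} = tanh(W h_t) via QR iteration on the tangent dynamics,
    and the flossing-style penalty sum(lyap**2) that pushes them
    toward zero (the edge of chaos)."""
    n = W.shape[0]
    h = h0.copy()
    # Orthonormal basis of k tangent directions.
    Q = np.linalg.qr(np.random.default_rng(seed).normal(size=(n, k)))[0]
    log_r = np.zeros(k)
    for _ in range(steps):
        h = np.tanh(W @ h)
        Q = (1.0 - h**2)[:, None] * (W @ Q)   # apply the Jacobian
        Q, R = np.linalg.qr(Q)                # re-orthonormalize
        log_r += np.log(np.abs(np.diag(R)))   # accumulate growth rates
    lyap = log_r / steps
    return float(np.sum(lyap**2)), lyap
```

In practice the penalty is added to the task loss and minimized with respect to the weights; the paper's contribution is making this compatible with biologically plausible (local) updates.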