Dynamical stability for dense patterns in discrete attractor neural networks

📅 2025-07-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the dynamical stability of discrete attractor neural networks under high memory load, going beyond classical critical-capacity limits by introducing the concept of a "critical load." Combining stability analysis of nonlinear dynamical systems, random matrix theory, and mean-field methods, the authors perform a spectral analysis of the Jacobian at the network's fixed points in the presence of noise, distinguishing the bulk eigenvalue statistics from outlier eigenvalues. They show how threshold-linear activation functions and quasi-sparse activity patterns jointly enhance stability, yielding a theory of local stability applicable to a broad class of network models. The theory predicts the stability of all fixed points up to the critical load, establishing that dynamical stability is achievable even under dense pattern storage, and provides a quantitative foundation for understanding the robustness of biological memory.

📝 Abstract
Neural networks storing multiple discrete attractors are canonical models of biological memory. Previously, the dynamical stability of such networks could only be guaranteed under highly restrictive conditions. Here, we derive a theory of the local stability of discrete fixed points in a broad class of networks with graded neural activities and in the presence of noise. By directly analyzing the bulk and outliers of the Jacobian spectrum, we show that all fixed points are stable below a critical load that is distinct from the classical *critical capacity* and depends on the statistics of neural activities in the fixed points as well as the single-neuron activation function. Our analysis highlights the computational benefits of threshold-linear activation and sparse-like patterns.
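As a rough illustration of the kind of analysis the abstract describes (not the paper's actual model), the sketch below linearizes a rate network dx/dt = -x + W·phi(x) with a threshold-linear phi around a hypothetical fixed point where only a fraction `f` of neurons is active. The parameters `N`, `g`, and `f` are assumptions chosen for the example; the point is that for Gaussian connectivity the bulk of the Jacobian spectrum sits in a circle of radius g·√f centered at -1, so sparser activity pulls the bulk further from the instability line Re(λ) = 0.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000   # number of neurons (hypothetical)
g = 1.2    # synaptic gain (hypothetical)
f = 0.3    # fraction of active neurons in the "quasi-sparse" pattern

# Random Gaussian connectivity with entry variance g^2 / N
W = g * rng.standard_normal((N, N)) / np.sqrt(N)

# Threshold-linear phi has derivative 1 for active neurons and 0 otherwise,
# so at the fixed point the gain matrix D is a 0/1 diagonal with ~f*N ones.
active = rng.random(N) < f
D = np.diag(active.astype(float))

# Jacobian of dx/dt = -x + W @ phi(x) evaluated at the fixed point
J = -np.eye(N) + D @ W

eigs = np.linalg.eigvals(J)

# Predicted bulk: a circle of radius g*sqrt(f) centered at -1.
# Smaller f (sparser patterns) shrinks the bulk away from Re = 0.
bulk_radius = g * np.sqrt(f)
print("predicted bulk edge:", -1 + bulk_radius)
print("max Re(eig):        ", eigs.real.max())
```

The stability criterion is simply that the rightmost eigenvalue stays in the left half-plane; here g·√f < 1, so the fixed point is locally stable despite g > 1.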
Problem

Research questions and friction points this paper is trying to address.

Ensuring dynamical stability in dense neural network patterns
Overcoming restrictive conditions for discrete attractor stability
Determining critical load for fixed-point stability in noisy networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Derive theory for local stability in networks
Analyze Jacobian spectrum bulk and outliers
Highlight benefits of threshold-linear activation
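The bulk/outlier distinction mentioned above can be sketched with a standard random-matrix example (parameters are hypothetical, not taken from the paper): a stored pattern adds a rank-one term to otherwise random connectivity, which detaches a single outlier eigenvalue from the random bulk. For a Ginibre bulk of radius `g` perturbed by c·ξξᵀ/N, the outlier sits near c + g²/c while the bulk stays inside radius g.

```python
import numpy as np

rng = np.random.default_rng(1)
N, g, c = 1000, 0.5, 2.0                   # hypothetical parameters
xi = np.sign(rng.standard_normal(N))       # a stored binary pattern

# Random bulk plus a rank-one Hebbian-like pattern term
W = g * rng.standard_normal((N, N)) / np.sqrt(N) + c * np.outer(xi, xi) / N

eigs = np.linalg.eigvals(W)
order = np.argsort(np.abs(eigs))
outlier = eigs[order[-1]]                  # largest-modulus eigenvalue
bulk_edge = np.abs(eigs[order[-2]])        # edge of the remaining bulk

print("bulk edge ~", bulk_edge)            # close to g
print("outlier   ~", outlier.real)         # close to c + g**2 / c
```

In the Jacobian of the full nonlinear network, it is such pattern-induced outliers, rather than the bulk, that typically determine whether a stored fixed point loses stability.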