Similarity Matching Networks: Hebbian Learning and Convergence Over Multiple Time Scales

📅 2025-06-06
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing similarity-matching networks lack a rigorous convergence analysis across multiple time scales. Method: We construct a continuous-time dynamical model with three distinct time scales: fast (neuronal activity), intermediate (lateral anti-Hebbian synapses), and slow (feedforward Hebbian synapses). On this model we establish a multiscale gradient-flow convergence framework covering strongly convex, strongly concave, and nonconvex nonsmooth settings, and we design a biologically interpretable network architecture grounded in a min-max-min objective. Contribution/Results: We prove global or almost-sure convergence of all three dynamical layers to principal subspace solutions, relying on two empirically supported conjectures. The analysis combines optimization on the positive-definite matrix manifold with biologically plausible synaptic plasticity rules, and numerical experiments validate both convergence and functional efficacy. The theory provides formal support for two fundamental neurocomputational hypotheses: (i) subspace learning via coordinated synaptic dynamics, and (ii) hierarchical temporal integration in unsupervised neural computation.
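For context, the min-max-min objective mentioned above follows the standard construction in the similarity-matching literature; the sketch below uses that construction, and the paper's exact scaling and constraints may differ.

```latex
% Similarity matching for an offline data batch X = [x_1, ..., x_T],
% with outputs Y = [y_1, ..., y_T]:
%   min_Y  (1/T^2) || X^T X - Y^T Y ||_F^2
% Introducing feedforward weights W and lateral weights M as auxiliary
% variables gives, up to terms independent of (Y, W, M):
\min_{W}\ \max_{M}\ \min_{Y}\;
  2\operatorname{Tr}\!\left(W^{\top}W\right)
  - \operatorname{Tr}\!\left(M^{\top}M\right)
  + \frac{1}{T}\sum_{t=1}^{T}
    \left(-4\,x_t^{\top}W^{\top}y_t + 2\,y_t^{\top}M y_t\right)
```

The nesting mirrors the three time scales: the innermost minimization over Y is solved by the fast neural dynamics, the maximization over M by the intermediate lateral dynamics, and the outer minimization over W by the slow feedforward dynamics.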

📝 Abstract
A recent breakthrough in biologically-plausible normative frameworks for dimensionality reduction is based on the similarity matching cost function and the low-rank matrix approximation problem. Despite its clear biological interpretation, successful application in several domains, and experimental validation, a formal, complete convergence analysis for this framework has remained elusive. Building on this framework, we consider and analyze a continuous-time neural network, the *similarity matching network*, for principal subspace projection. Derived from a min-max-min objective, this biologically-plausible network consists of three coupled dynamics evolving at different time scales: neural dynamics, lateral synaptic dynamics, and feedforward synaptic dynamics at the fast, intermediate, and slow time scales, respectively. The feedforward and lateral synaptic dynamics follow Hebbian and anti-Hebbian learning rules, respectively. By leveraging a multilevel optimization framework, we prove convergence of the dynamics in the offline setting. Specifically, at the first level (fast time scale), we show strong convexity of the cost function and global exponential convergence of the corresponding gradient-flow dynamics. At the second level (intermediate time scale), we prove strong concavity of the cost function and exponential convergence of the corresponding gradient-flow dynamics within the space of positive definite matrices. At the third and final level (slow time scale), we study a non-convex and non-smooth cost function, provide explicit expressions for its global minima, and prove almost sure convergence of the corresponding gradient-flow dynamics to the global minima. These results rely on two empirically motivated conjectures that are supported by thorough numerical experiments. Finally, we validate the effectiveness of our approach via a numerical example.
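To make the offline setting concrete, here is a minimal NumPy sketch of the three coupled gradient flows under explicit Euler integration. Everything in it (dimensions, data spectrum, time constants, step counts) is illustrative rather than taken from the paper; the check at the end verifies that the equilibrium input-output map M⁻¹W spans the top-k principal subspace of the data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Offline batch with a dominant k-dimensional principal subspace
# (dimensions and spectrum are illustrative, not from the paper).
n, k, T = 10, 3, 500
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = np.concatenate([[4.0, 3.0, 2.0], 0.1 * np.ones(n - k)])  # covariance spectrum
X = U @ np.diag(np.sqrt(lam)) @ rng.standard_normal((n, T))

Y = np.zeros((k, T))                    # fast: neural activities
M = np.eye(k)                           # intermediate: lateral weights (positive definite)
W = 0.1 * rng.standard_normal((k, n))   # slow: feedforward weights (random init avoids W = 0)

dt = 1e-2
tau_y, tau_M, tau_W = 1.0, 10.0, 100.0  # time-scale separation: tau_y << tau_M << tau_W

for _ in range(200_000):                # step count is illustrative
    Y += (dt / tau_y) * (W @ X - M @ Y)       # fast: gradient descent in Y
    M += (dt / tau_M) * (Y @ Y.T / T - M)     # intermediate: anti-Hebbian lateral rule
    W += (dt / tau_W) * (Y @ X.T / T - W)     # slow: Hebbian feedforward rule

# At convergence, F = M^{-1} W should have orthonormal rows spanning
# the top-k principal subspace of the data.
F = np.linalg.solve(M, W)
Uk = np.linalg.svd(X, full_matrices=False)[0][:, :k]
P_F = F.T @ np.linalg.pinv(F @ F.T) @ F   # projector onto rowspace(F)
P_U = Uk @ Uk.T                           # projector onto the principal subspace
print("subspace error:", np.linalg.norm(P_F - P_U))
```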
Problem

Research questions and friction points this paper is trying to address.

Analyzing convergence of similarity matching networks
Studying multi-time-scale neural and synaptic dynamics
Proving global convergence for biologically-plausible learning rules
Innovation

Methods, ideas, or system contributions that make the work stand out.

Continuous-time neural network for subspace projection
Hebbian and anti-Hebbian learning rules (sketched as gradient flows after this list)
Multilevel optimization for convergence proof
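Concretely, the three contributions above correspond to gradient flows of the min-max-min objective at separated time scales. The following sketch abstracts away the paper's exact time constants and its positive-definite-manifold treatment of M:

```latex
\begin{aligned}
\tau_y\,\dot{y}_t &= W x_t - M y_t
  && \text{(fast: neural dynamics)}\\
\tau_M\,\dot{M} &= \tfrac{1}{T}\,Y Y^{\top} - M
  && \text{(intermediate: anti-Hebbian lateral rule)}\\
\tau_W\,\dot{W} &= \tfrac{1}{T}\,Y X^{\top} - W
  && \text{(slow: Hebbian feedforward rule)}
\end{aligned}
\qquad \tau_y \ll \tau_M \ll \tau_W
```

Note that the lateral update is Hebbian in form (it grows with output-output correlations) but anti-Hebbian in effect, since M enters the neural dynamics with an inhibitory sign.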