🤖 AI Summary
A fundamental gap separates artificial intelligence from computational neuroscience in how learning is modeled: mainstream AI relies on gradient-based optimization, a biologically implausible process.
Method: We propose a gradient-free, dynamic distributed learning paradigm based on an asymmetric deep recurrent neural network. Sparse excitatory connections are introduced to generate a high-dimensional, dense, and geometrically stable manifold of internal representations. Input-output mappings emerge naturally through the recurrent dynamics, and a geometrically grounded dynamic learning rule, which exploits the stability of the attractor configurations, enables convergence to persist even after supervision is withdrawn.
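As a rough illustration of the dynamics described above, the sketch below iterates binary units under an asymmetric random coupling matrix augmented with sparse symmetric excitatory links until a stable configuration is reached. All names and parameter values (`N`, the excitatory density, the update schedule) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                        # number of binary units (assumed)
J = rng.normal(0, 1 / np.sqrt(N), (N, N))      # asymmetric random couplings
mask = np.triu(rng.random((N, N)) < 0.05, 1)   # sparse excitatory pairs (5%, assumed)
J += 1.0 * (mask + mask.T)                     # add symmetric excitatory part
np.fill_diagonal(J, 0.0)

def iterate(s, J, max_steps=500):
    """Asynchronous sign updates until no unit flips, i.e. a stable configuration."""
    for _ in range(max_steps):
        changed = False
        for i in rng.permutation(len(s)):
            new = 1 if J[i] @ s >= 0 else -1
            if new != s[i]:
                s[i], changed = new, True
        if not changed:                        # fixed point of the dynamics
            return s, True
    return s, False

s0 = rng.choice([-1, 1], N)                    # random initial configuration
s_star, converged = iterate(s0.copy(), J)
```

The symmetric excitatory part biases the asynchronous dynamics toward fixed points; convergence is checked, not assumed, since the asymmetric part alone does not guarantee it.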
Contribution/Results: The model achieves competitive performance on standard AI benchmarks while maintaining biological plausibility and computational scalability. It offers a novel, brain-inspired alternative to gradient-based learning, advancing the pursuit of neuroscientifically grounded, scalable, gradient-free machine learning.
📝 Abstract
We show that asymmetric deep recurrent neural networks, enhanced with additional sparse excitatory couplings, give rise to an exponentially large, dense, accessible manifold of internal representations that can be found by different algorithms, including simple iterative dynamics. Building on the geometrical properties of the stable configurations, we propose a distributed learning scheme in which input-output associations emerge naturally from the recurrent dynamics, without any need for gradient evaluation. A critical feature enabling the learning process is the stability of the configurations reached at convergence, even after removal of the supervisory output signal. Extensive simulations demonstrate that this approach performs competitively on standard AI benchmarks. The model can be generalized in multiple directions, both computational and biological, potentially contributing to narrowing the gap between AI and computational neuroscience.
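A minimal sketch of the learning scheme described in the abstract might look as follows: the output units are clamped to the target while the network relaxes, a local Hebbian-style rule then reinforces the configuration reached, and finally the supervisory signal is removed to test whether the state persists. The specific update rule, learning rate, and network sizes here are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
N, N_out = 100, 5                               # total units / output units (assumed)
J = rng.normal(0, 1 / np.sqrt(N), (N, N))       # asymmetric couplings
np.fill_diagonal(J, 0.0)

def relax(s, J, clamp=None, steps=50):
    """Synchronous sign dynamics; output units optionally clamped to a target."""
    s = s.copy()
    for _ in range(steps):
        s = np.sign(J @ s + 1e-12)              # tiny offset avoids sign(0) = 0
        if clamp is not None:
            s[-N_out:] = clamp                  # supervisory output signal
    return s

def train_pattern(J, x, y, lr=0.05, epochs=20):
    """Local, gradient-free updates that stabilize the clamped configuration."""
    for _ in range(epochs):
        s = relax(x, J, clamp=y)                # relax with the target clamped
        unstable = np.sign(J @ s + 1e-12) != s  # units not aligned with their field
        J[unstable] += lr * np.outer(s[unstable], s)   # Hebbian-style local rule
        np.fill_diagonal(J, 0.0)
    return J

x = rng.choice([-1.0, 1.0], N)                  # input configuration
y = rng.choice([-1.0, 1.0], N_out)              # target output
J = train_pattern(J, x, y)
s_free = relax(x, J, clamp=None)                # supervision withdrawn
recall = float(np.mean(s_free[-N_out:] == y))   # fraction of recalled output bits
```

Note that each weight update uses only the pre- and post-synaptic states of that connection: no gradient and no backward pass is ever computed, mirroring the paper's emphasis on stability of the converged configurations after the supervisory signal is removed.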