🤖 AI Summary
This work proposes an associative memory architecture built from Kuramoto oscillators coupled in a honeycomb (chained-ring) topology to overcome the at-most-linear storage capacity of classical associative memory models such as Hopfield networks. In this framework, memories are encoded as stable phase-locked configurations. Theoretical analysis demonstrates that the system achieves exponential storage capacity: a network whose $m$ honeycomb cycles each contain $n_c$ oscillators can store $(2 \lceil n_c/4 \rceil - 1)^m$ distinct patterns, with each basin of attraction guaranteed a minimum size independent of network scale. Numerical simulations confirm the stability of the phase-locking dynamics and validate hardware feasibility using charge-density-wave (CDW) oscillators, thereby substantially surpassing the capacity constraints inherent in traditional models.
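The phase-locking behavior that the simulations confirm can be illustrated with a generic Kuramoto ring (not the paper's code): identical oscillators with nearest-neighbor sine coupling relax into a "twisted" locked state in which successive phase differences are all equal. The function name and parameters below are illustrative choices, and the $q=1$ twisted state used here is known to be stable on a nearest-neighbor ring when $q < n/4$.

```python
import math

def simulate_kuramoto_ring(n=8, coupling=1.0, dt=0.01, steps=20000):
    """Euler-integrate n identical Kuramoto oscillators on a ring:
        dtheta_i/dt = K * [sin(theta_{i-1} - theta_i) + sin(theta_{i+1} - theta_i)]
    starting near the q=1 "twisted" state (phases wind once around the ring)."""
    theta = [2 * math.pi * i / n + 0.01 * math.sin(i) for i in range(n)]
    for _ in range(steps):
        # Synchronous Euler step: the comprehension reads the old theta list.
        theta = [theta[i] + dt * coupling * (
                     math.sin(theta[(i - 1) % n] - theta[i])
                     + math.sin(theta[(i + 1) % n] - theta[i]))
                 for i in range(n)]
    # In the locked twisted state, successive phase differences equal 2*pi/n.
    return [(theta[(i + 1) % n] - theta[i]) % (2 * math.pi) for i in range(n)]

diffs = simulate_kuramoto_ring()
print(all(abs(d - 2 * math.pi / 8) < 1e-3 for d in diffs))  # True: ring phase-locks
```

The small initial perturbation decays and the phase differences settle to $2\pi/n$, the signature of a stable phase-locked configuration of the kind used here to encode memories.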
📝 Abstract
Associative memory systems enable content-addressable storage and retrieval of patterns, a capability central to biological neural computation and artificial intelligence. Classical implementations such as Hopfield networks face fundamental limitations in memory capacity, scaling at most linearly with network size. We present an associative memory architecture based on Kuramoto oscillator networks with honeycomb topology in which memories are encoded as stable phase-locked configurations. The honeycomb network consists of multiple cycles that share nodes in a chain-like arrangement, creating a one-dimensional lattice of chained loops. We prove that this architecture achieves exponential memory capacity: a network of $N$ oscillators can store $(2\lceil n_c/4 \rceil - 1)^m$ distinct patterns, where $m$ honeycomb cycles each contain $n_c$ oscillators. Moreover, we fully characterize all stable configurations and prove that each memory's basin of attraction maintains a guaranteed minimum size independent of network scale. Simulations using charge-density-wave (CDW) oscillators validate predicted phase-locking behavior, demonstrating practical realizability in neuromorphic hardware.
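To make the stated capacity concrete, here is a minimal sketch of the formula above (the function name `honeycomb_capacity` is my own, not from the paper): each cycle contributes a factor of $2\lceil n_c/4 \rceil - 1$, so the total pattern count grows exponentially in the number of cycles $m$.

```python
import math

def honeycomb_capacity(n_c: int, m: int) -> int:
    """Storable patterns per the paper's formula for m honeycomb
    cycles of n_c oscillators each: (2*ceil(n_c/4) - 1) ** m."""
    return (2 * math.ceil(n_c / 4) - 1) ** m

# The per-cycle factor grows with cycle size; capacity is exponential in m.
print(honeycomb_capacity(8, 10))   # 3**10 = 59049
print(honeycomb_capacity(12, 10))  # 5**10 = 9765625
```

By contrast, a Hopfield network's capacity scales at most linearly in the number of units, so for fixed $n_c$ the gap widens exponentially as cycles are added.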