Modern Hopfield Networks with Continuous-Time Memories

📅 2025-02-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the inefficiency and poor scalability of discrete Hopfield networks in large-scale memory storage, this paper proposes a continuous-time memory mechanism that compresses a large set of discrete memories into a compact continuous representation. Methodologically, it draws on psychological theories of neural resource allocation to formulate a continuous-attention energy function based on probability densities, unifying the modeling of working-memory attractor dynamics and resource-efficient allocation. A continuous-time update rule is designed in which probability densities replace the conventional softmax distribution, eliminating the explicit storage of individual memory items. Experiments on synthetic and video datasets demonstrate performance competitive with discrete modern Hopfield networks while reducing memory footprint by approximately 60% and computational overhead by over 45%. This work establishes a scalable, resource-aware paradigm for high-capacity associative memory.

📝 Abstract
Recent research has established a connection between modern Hopfield networks (HNs) and transformer attention heads, with guarantees of exponential storage capacity. However, these models still face challenges scaling storage efficiently. Inspired by psychological theories of continuous neural resource allocation in working memory, we propose an approach that compresses large discrete Hopfield memories into smaller, continuous-time memories. Leveraging continuous attention, our new energy function modifies the update rule of HNs, replacing the traditional softmax-based probability mass function with a probability density over the continuous memory. This formulation aligns with modern perspectives on human executive function, offering a principled link between attractor dynamics in working memory and resource-efficient memory allocation. Our framework maintains competitive performance with HNs while leveraging a compressed memory, reducing computational costs across synthetic and video datasets.
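The contrast the abstract draws, between a softmax over discrete stored items and a probability density over a continuous-time memory, can be sketched numerically. The following is a minimal illustration, not the authors' implementation: the Gaussian RBF basis, the least-squares fit of the memory curve, and the Riemann-sum approximation of the integral are all assumptions made here for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_basis(centers, width):
    """Return a function mapping t (shape (T,)) to a (T, K) Gaussian RBF design matrix."""
    def basis(t):
        return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))
    return basis

def discrete_update(X, q, beta=4.0):
    """Modern Hopfield retrieval: softmax attention over N stored patterns (rows of X)."""
    s = beta * X @ q
    p = np.exp(s - s.max())
    p /= p.sum()
    return X.T @ p

def continuous_update(coeffs, basis, q, beta=4.0, n_grid=400):
    """Retrieval with a continuous-time memory xbar(t) = basis(t) @ coeffs.
    The softmax over N items becomes a Gibbs *density* over t in [0, 1],
    approximated here by a Riemann sum on a grid."""
    t = np.linspace(0.0, 1.0, n_grid)
    Xbar = basis(t) @ coeffs              # (n_grid, d): the memory curve
    s = beta * Xbar @ q
    w = np.exp(s - s.max())
    w /= w.sum()                          # discretized Gibbs density
    return Xbar.T @ w                     # expected memory under that density

# Compression step: store N patterns with K < N basis coefficients,
# so no individual memory item is kept explicitly.
N, d, K = 32, 16, 12
X = rng.standard_normal((N, d))
basis = rbf_basis(np.linspace(0.0, 1.0, K), width=1.0 / K)
t_obs = np.linspace(0.0, 1.0, N)          # place pattern n at time n/(N-1)
coeffs, *_ = np.linalg.lstsq(basis(t_obs), X, rcond=None)   # (K, d)

q = X[5] + 0.1 * rng.standard_normal(d)   # noisy query
out = continuous_update(coeffs, basis, q)
```

With K < N coefficients the curve cannot represent every pattern exactly, which is the trade-off the abstract describes: retrieval quality depends on how well the smooth memory xbar(t) fits the stored patterns, in exchange for a memory footprint that no longer grows with the number of items.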
Problem

Research questions and friction points this paper is trying to address.

Enhancing Hopfield network efficiency
Continuous-time memory compression
Reducing computational costs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Continuous-time memory compression
Modified energy function update
Probability density replaces softmax
Saul Santos
Instituto de Telecomunicações, Instituto Superior Técnico, Universidade de Lisboa
António Farinhas
Sword Health
Machine Learning · Natural Language Processing
Daniel C. McNamee
Champalimaud Research
André F. T. Martins
Instituto de Telecomunicações, Instituto Superior Técnico, Universidade de Lisboa, Unbabel, ELLIS Unit Lisbon