Geometric Entropy and Retrieval Phase Transitions in Continuous Thermal Dense Associative Memory

📅 2026-04-08
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study investigates the thermodynamic memory capacity and retrieval phase transitions of continuous-state modern Hopfield networks under N-dimensional spherical geometric constraints. Using tools from statistical physics, thermodynamic phase-transition theory, and spherical geometry, the work systematically compares the retrieval dynamics of two activation kernels: the Gaussian (LSE) kernel and the Epanechnikov (LSR) kernel. Theoretical analysis reveals that the geometric entropy depends solely on the spherical structure and is independent of the kernel choice. Notably, the LSR kernel, owing to its compact support, suppresses spurious states at low load ratios (α), thereby enabling a robust retrieval mechanism. The study establishes that the theoretical maximum capacity reaches α = 0.5 in the zero-temperature limit, delineates the fundamental differences between the two kernels' phase-diagram structures, and precisely characterizes the robustness boundary for high-capacity associative memory.
📝 Abstract
We study the thermodynamic memory capacity of modern Hopfield networks (Dense Associative Memory models) with continuous states under geometric constraints, extending classical analyses of pairwise associative memory. We derive thermodynamic phase boundaries for Dense Associative Memory networks with exponential capacity $p = e^{\alpha N}$, comparing Gaussian (LSE) and Epanechnikov (LSR) kernels. For continuous neurons on an $N$-sphere, the geometric entropy depends solely on the spherical geometry, not the kernel. In the sharp-kernel regime, the maximum theoretical capacity $\alpha = 0.5$ is achieved at zero temperature; below this threshold, a critical line separates retrieval from a spin-glass phase. The two kernels differ qualitatively in their phase boundary structure: for LSE, the retrieval region extends to arbitrarily high temperatures as $\alpha \to 0$, but interference from spurious patterns is always present. For LSR, the finite support introduces a threshold $\alpha_{\text{th}}$ below which no spurious patterns contribute to the noise floor, producing a qualitatively different retrieval regime in this sub-threshold region. These results advance the theory of high-capacity associative memory and clarify fundamental limits of retrieval robustness in modern attention-like memory architectures.
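The abstract's key contrast is between the Gaussian kernel's infinite support (every stored pattern always contributes some interference) and the Epanechnikov kernel's compact support (patterns beyond a finite radius contribute exactly zero). A minimal numerical sketch of that distinction is below; it is illustrative only, not the paper's exact energy functional, and the function names, the inverse temperature `beta`, and the support radius are assumptions.

```python
import numpy as np

def gaussian_kernel(d, beta=1.0):
    # Gaussian (LSE-type) similarity: strictly positive at any distance d,
    # so distant spurious patterns always leave residual interference.
    return np.exp(-beta * d**2 / 2.0)

def epanechnikov_kernel(d, support=1.0):
    # Epanechnikov (LSR-type) similarity: compact support means the
    # contribution is exactly zero once d exceeds the support radius.
    return np.maximum(0.0, 1.0 - (d / support) ** 2)

# A query close to its target pattern (d = 0.1) and far from a
# spurious pattern (d = 2.0):
d_target, d_spurious = 0.1, 2.0

print(gaussian_kernel(d_spurious) > 0.0)       # True: nonzero noise floor
print(epanechnikov_kernel(d_spurious) == 0.0)  # True: spurious term vanishes
print(epanechnikov_kernel(d_target) > 0.0)     # True: target still retrieved
```

Under this toy picture, the threshold load $\alpha_{\text{th}}$ in the abstract corresponds to the regime where no spurious pattern falls inside the kernel's support around the query.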
Problem

Research questions and friction points this paper is trying to address.

Geometric Entropy
Associative Memory
Phase Transitions
Memory Capacity
Continuous States
Innovation

Methods, ideas, or system contributions that make the work stand out.

Geometric Entropy
Dense Associative Memory
Phase Transition
Retrieval Robustness
Kernel Support
Tatiana Petrova
Interdisciplinary Centre for Security, Reliability and Trust (SnT), University of Luxembourg, Luxembourg
Evgeny Polyachenko
Interdisciplinary Centre for Security, Reliability and Trust (SnT), University of Luxembourg, Luxembourg
Radu State
University of Luxembourg
Network Security
Network and Service Management