🤖 AI Summary
Existing dense associative memories (DAMs) operate exclusively on vector representations and lack the capacity to model the uncertainty inherent in probabilistic data.
Method: This work introduces the first DAM framework operating in the space of probability distributions, specifically the Gaussian family endowed with the Bures–Wasserstein metric. We define an energy function based on the 2-Wasserstein distance, whose fixed points are self-consistent Wasserstein barycenters, and retrieval dynamics that aggregate optimal transport maps under Gibbs weights.
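For reference, an energy of the kind described above can be written explicitly; the notation below is ours, and the paper's exact parameterization may differ. With stored distributions $\nu_1, \dots, \nu_K$, inverse temperature $\beta > 0$, and query $\mu$:

$$E(\mu) \;=\; -\frac{1}{\beta}\,\log \sum_{k=1}^{K} \exp\!\bigl(-\beta\, W_2^2(\mu, \nu_k)\bigr).$$

For Gaussians $\mu = \mathcal{N}(m, \Sigma)$ and $\nu = \mathcal{N}(m', \Sigma')$, the squared 2-Wasserstein distance has the closed form underlying the Bures–Wasserstein metric:

$$W_2^2(\mu, \nu) \;=\; \lVert m - m' \rVert^2 \;+\; \operatorname{tr}\!\Bigl(\Sigma + \Sigma' - 2\bigl(\Sigma^{1/2}\, \Sigma'\, \Sigma^{1/2}\bigr)^{1/2}\Bigr).$$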
Contribution/Results: We prove that the proposed memory achieves exponential storage capacity and provide quantitative recovery guarantees under Wasserstein perturbations. Experiments demonstrate high-accuracy distribution retrieval on both synthetic and real-world distributional tasks, along with robustness to perturbed queries. By unifying associative memory with generative modeling principles, this work establishes a novel paradigm bridging distributional representation learning and memory-based probabilistic reasoning.
📝 Abstract
Dense associative memories (DAMs) store and retrieve patterns as fixed points of an energy functional, but existing models are limited to vector representations. We extend DAMs to probability distributions equipped with the 2-Wasserstein distance, focusing mainly on the Bures–Wasserstein class of Gaussian densities. Our framework defines a log-sum-exp energy over stored distributions and retrieval dynamics that aggregate optimal transport maps with Gibbs weights. Stationary points correspond to self-consistent Wasserstein barycenters, generalizing classical DAM fixed points. We prove exponential storage capacity, provide quantitative retrieval guarantees under Wasserstein perturbations, and validate the model on synthetic and real-world distributional tasks. This work elevates associative memory from vectors to full distributions, bridging classical DAMs with modern generative modeling and enabling distributional storage and retrieval in memory-augmented learning.
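To make the retrieval dynamics concrete, here is a minimal numerical sketch for the Gaussian (Bures–Wasserstein) case, assuming the Gibbs-weighted aggregation of optimal transport maps described above. Function names, the `beta` parameter, and the exact form of the update are our own illustrative choices, not the paper's reference implementation.

```python
import numpy as np
from scipy.linalg import sqrtm
from scipy.special import logsumexp

def bw_dist2(m1, S1, m2, S2):
    """Squared 2-Wasserstein distance between N(m1, S1) and N(m2, S2)."""
    S1_half = np.real(sqrtm(S1))
    cross = np.real(sqrtm(S1_half @ S2 @ S1_half))
    return np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross)

def energy(m, S, means, covs, beta=1.0):
    """Log-sum-exp energy of the query N(m, S) over the stored Gaussians."""
    d2 = np.array([bw_dist2(m, S, mk, Sk) for mk, Sk in zip(means, covs)])
    return -logsumexp(-beta * d2) / beta

def retrieval_step(m, S, means, covs, beta=1.0):
    """One Gibbs-weighted aggregation of the optimal transport maps
    sending N(m, S) to each stored Gaussian N(means[k], covs[k])."""
    d2 = np.array([bw_dist2(m, S, mk, Sk) for mk, Sk in zip(means, covs)])
    w = np.exp(-beta * (d2 - d2.min()))   # Gibbs (softmax) weights,
    w /= w.sum()                          # shifted for numerical stability
    # The OT map from N(m, S) to N(mk, Sk) is x -> mk + Ak (x - m), with
    # Ak = S^{-1/2} (S^{1/2} Sk S^{1/2})^{1/2} S^{-1/2}.  Pushing N(m, S)
    # through the w-weighted average of these maps gives the update below,
    # which coincides with the classical fixed-point iteration for
    # Gaussian Wasserstein barycenters.
    S_half = np.real(sqrtm(S))
    S_half_inv = np.linalg.inv(S_half)
    M = sum(wk * np.real(sqrtm(S_half @ Sk @ S_half))
            for wk, Sk in zip(w, covs))
    m_new = sum(wk * mk for wk, mk in zip(w, means))
    S_new = S_half_inv @ M @ M @ S_half_inv
    return m_new, S_new

# Tiny usage example: a noisy query converges toward a stored pattern.
rng = np.random.default_rng(0)
means = [rng.normal(size=2) * 5.0 for _ in range(3)]
covs = [s * np.eye(2) for s in (0.5, 1.0, 2.0)]
m, S = means[0] + 0.3 * rng.normal(size=2), 0.8 * np.eye(2)
for _ in range(20):
    m, S = retrieval_step(m, S, means, covs, beta=2.0)
```

Each step is a softmax-weighted Bures–Wasserstein barycenter update, so a fixed point of the iteration satisfies exactly the self-consistency condition named in the abstract.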