Supervised and Unsupervised protocols for hetero-associative neural networks

📅 2025-05-24
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the limited learning capacity of Three-Directional Associative Memory (TAM) models in hetero-associative scenarios. We propose a generalized Hebbian coupling mechanism and, for the first time, combine spin-glass statistical mechanics (specifically the replica method and Guerra's interpolation technique) with Rademacher random datasets to systematically analyze capacity limits and performance differences under supervised and unsupervised learning protocols. We derive self-consistent equations for the order parameters and analytically determine the critical dataset-size threshold. Our theoretical predictions are validated by extensive numerical simulations and prove robust on both random and structured datasets. The analysis uncovers a three-layer cooperative computation mechanism, offering a novel paradigm and theoretical foundation for neural architectures designed to model ternary relational structures.

📝 Abstract
This paper introduces a learning framework for Three-Directional Associative Memory (TAM) models, extending the classical Hebbian paradigm to both supervised and unsupervised protocols in a hetero-associative setting. These neural networks consist of three interconnected layers of binary neurons interacting via generalized Hebbian synaptic couplings that allow learning, storage and retrieval of structured triplets of patterns. Relying on glassy statistical-mechanical techniques (mainly replica theory and Guerra interpolation), we analyze the emergent computational properties of these networks trained on random (Rademacher) datasets at the replica-symmetric level of description: we obtain a set of self-consistency equations for the order parameters that quantify the critical dataset sizes (i.e. the thresholds for learning) and describe the retrieval performance of these networks, highlighting the differences between supervised and unsupervised protocols. Numerical simulations validate our theoretical findings and demonstrate that this picture remains robust when TAMs operate on structured datasets. In particular, this study provides insight into the cooperative interplay among layers, beyond that of the neurons within each layer, with potential implications for the optimal design of artificial neural network architectures.
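The three-layer Hebbian mechanism described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration only: the layer sizes, the pairwise Hebbian rule, and the zero-temperature update schedule are illustrative assumptions, not the paper's exact supervised/unsupervised protocols. Triplets of Rademacher patterns are stored in couplings between each pair of layers, and cueing one layer lets the other two cooperatively reconstruct the rest of the triplet:

```python
import numpy as np

rng = np.random.default_rng(0)
K, Na, Nb, Nc = 5, 60, 50, 40  # number of stored triplets and layer sizes (illustrative)

# Rademacher (+/-1) pattern triplets: one (xi, eta, chi) per index mu
xi  = rng.choice([-1, 1], size=(K, Na))
eta = rng.choice([-1, 1], size=(K, Nb))
chi = rng.choice([-1, 1], size=(K, Nc))

# Hebbian couplings between each pair of layers store the triplets
J_ab = xi.T @ eta / Na
J_ac = xi.T @ chi / Na
J_bc = eta.T @ chi / Nb

def sgn(x):
    return np.where(x >= 0, 1, -1)  # sign convention that avoids zeros

def retrieve(a, b, c, sweeps=10):
    """Zero-temperature dynamics: each layer aligns with the field from the other two."""
    for _ in range(sweeps):
        b = sgn(J_ab.T @ a + J_bc @ c)
        c = sgn(J_ac.T @ a + J_bc.T @ b)
        a = sgn(J_ab @ b + J_ac @ c)
    return a, b, c

# Cue layer a with a 10%-corrupted copy of triplet 0; layers b and c start random
cue = xi[0] * rng.choice([1, -1], p=[0.9, 0.1], size=Na)
a, b, c = retrieve(cue, rng.choice([-1, 1], Nb), rng.choice([-1, 1], Nc))

# Mattis overlaps with the cued triplet (values near 1 indicate successful retrieval)
print(np.mean(a * xi[0]), np.mean(b * eta[0]), np.mean(c * chi[0]))
```

At this low load (K patterns much smaller than the layer sizes) the dynamics drive all three Mattis overlaps close to one, illustrating the cooperative interplay among layers that the analysis quantifies.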
Problem

Research questions and friction points this paper is trying to address.

Extends Hebbian learning to supervised and unsupervised hetero-associative TAM models
Analyzes computational properties of networks with random datasets using statistical mechanics
Explores layer interplay for optimal artificial neural network design
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends Hebbian paradigm to supervised and unsupervised protocols
Uses glassy statistical mechanical techniques for analysis
Analyzes cooperative interplay of layers in neural networks
Andrea Alessandrelli
Dipartimento di Informatica, Università di Pisa, Pisa, Italy; Istituto Nazionale di Fisica Nucleare, Sezione di Lecce, Italy; Istituto Nazionale d'Alta Matematica, GNFM, Roma, Italy
Adriano Barra
Sapienza Università di Roma
Statistical Mechanics, Neural Networks, Complex Systems, Theoretical Biology
Andrea Ladiana
Dipartimento di Matematica e Fisica, Università del Salento, Lecce, Italy
Andrea Lepre
Dipartimento di Matematica e Fisica, Università del Salento, Lecce, Italy
Federico Ricci-Tersenghi
Professor of Theoretical Physics, "La Sapienza" University, Roma
Statistical Mechanics, Disordered and Complex Systems, Numerical Simulations, Optimization Problems