Model Connectomes: A Generational Approach to Data-Efficient Language Models

📅 2025-04-29
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Enhancing language models’ learning efficiency and neural-behavioral alignment under low-data regimes remains challenging. Method: Inspired by dual mechanisms in biological neural systems—phylogenetic evolution and ontogenetic learning—we propose a “meta-evolutionary outer loop + adaptive inner loop” framework. Crucially, we introduce a heritable, sparse “model connectome” as a structural prior: the outer loop employs evolutionary algorithms to optimize initial connection topology, while the inner loop performs supervised fine-tuning on only 100M tokens. Contribution/Results: The model undergoes cognitive alignment evaluation—including human behavioral metrics and fMRI responses—and achieves performance comparable to or exceeding same-scale baselines across diverse NLP tasks and neurocognitive benchmarks. It demonstrates significantly improved few-shot generalization and data efficiency, validating the efficacy of evolution-informed architectural priors for low-resource language modeling.
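The summarized framework can be illustrated with a toy sketch: an outer evolutionary loop searches over sparse binary connectivity masks (standing in for the "model connectome"), and each candidate's fitness is its loss after a short inner learning loop under that fixed mask. Everything here is an illustrative assumption (a tiny linear regression task, truncation selection, bit-flip mutation), not the paper's actual architecture or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task standing in for the developmental-scale corpus.
X = rng.normal(size=(256, 16))
true_w = np.zeros(16)
true_w[:4] = 1.0                                  # only a few useful inputs
y = X @ true_w + 0.1 * rng.normal(size=256)

def inner_loop(mask, steps=20, lr=0.1):
    """Ontogenetic learning: brief gradient descent under a fixed connectome."""
    w = np.zeros(16)
    for _ in range(steps):
        grad = 2 * X.T @ (X @ (w * mask) - y) / len(y)
        w -= lr * grad * mask                     # only inherited connections learn
    return np.mean((X @ (w * mask) - y) ** 2)     # lifetime fitness (lower = better)

def mutate(mask, p=0.05):
    """Flip each connection on/off with small probability."""
    flip = rng.random(mask.shape) < p
    return np.where(flip, 1 - mask, mask)

# Phylogenetic outer loop: evolve the sparse mask across generations.
population = [(rng.random(16) < 0.3).astype(float) for _ in range(20)]
for gen in range(30):
    scored = sorted(population, key=inner_loop)   # rank by post-learning loss
    parents = scored[:5]                          # truncation selection
    population = parents + [mutate(p) for p in parents for _ in range(3)]

best = min(population, key=inner_loop)
print("evolved connectome:", best.astype(int), "loss:", round(inner_loop(best), 3))
```

The key design point mirrored here is that selection acts on the *initial connectivity*, not on learned weights: each generation re-learns from scratch, so the mask that survives is the one that makes brief, data-limited learning most effective.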

📝 Abstract
Biological neural networks are shaped both by evolution across generations and by individual learning within an organism's lifetime, whereas standard artificial neural networks undergo a single, large training procedure without inherited constraints. In this preliminary work, we propose a framework that incorporates this crucial generational dimension - an "outer loop" of evolution that shapes the "inner loop" of learning - so that artificial networks better mirror the effects of evolution and individual learning in biological organisms. Focusing on language, we train a model that inherits a "model connectome" from the outer evolution loop before exposing it to a developmental-scale corpus of 100M tokens. Compared with two closely matched control models, we show that the connectome model performs better or on par on natural language processing tasks as well as alignment to human behavior and brain data. These findings suggest that a model connectome serves as an efficient prior for learning in low-data regimes - narrowing the gap between single-generation artificial models and biologically evolved neural networks.
Problem

Research questions and friction points this paper is trying to address.

Incorporating generational evolution into artificial neural networks
Improving language model performance with inherited connectomes
Bridging gap between artificial and biological neural networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generational evolution loop shapes learning
Inherited model connectome serves as a structural prior
Efficient learning in low-data regimes achieved