Energy Costs and Neural Complexity Evolution in Changing Environments

📅 2025-11-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates how environmental seasonality and energetic constraints jointly shape the evolution of neural complexity, testing the Cognitive Buffer Hypothesis (CBH) and the Expensive Brain Hypothesis (EBH). Method: Using an evolutionary reinforcement learning framework, the authors evolved autonomous agents with artificial neural networks in dynamically varying environments, under two selective pressures: metabolic cost and environmental variability. Contribution/Results: Contrary to CBH predictions, high seasonality did not drive expansion of network size or predictive capacity; instead, it favored smaller, sparser, and computationally more efficient architectures. Energetic constraints emerged as the dominant selective force, significantly increasing information-processing efficiency per unit energy. This work provides the first quantitative, controlled-evolution validation of EBH's core mechanism, demonstrating that efficiency optimization, not size scaling, constitutes the primary evolutionary trajectory of neural complexity under metabolic limitation.
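
The selection dynamic the summary describes can be illustrated with a minimal toy model. Everything below is a hypothetical sketch, not the paper's actual framework: the seasonality model, the fitness function, the parameter values, and the truncation-selection loop are all illustrative assumptions. The key mechanism is that foraging intake is capped at baseline abundance, so surplus seasons cannot compensate for lean ones; higher seasonality therefore lowers mean net energy intake and shifts the optimal network size downward, consistent with EBH rather than CBH.

```python
import math
import random

random.seed(0)  # make the toy run reproducible

def seasonal_resources(step, amplitude, mean=1.0, period=100):
    """Resource availability oscillating over a 'year' of `period` steps."""
    return max(0.0, mean + amplitude * math.sin(2 * math.pi * step / period))

def lifetime_fitness(n_neurons, amplitude, steps=200, cost=0.01):
    """Net lifetime energy: foraging skill grows with brain size with
    diminishing returns, while metabolic cost is linear in neuron count.
    (Illustrative functional forms, not taken from the paper.)"""
    skill = math.log1p(n_neurons)
    net = 0.0
    for t in range(steps):
        # Intake is capped at baseline abundance: surplus seasons cannot
        # be banked, so stronger seasonality lowers mean net intake.
        net += skill * min(1.0, seasonal_resources(t, amplitude))
        net -= cost * n_neurons  # metabolic cost paid every step
    return net

def evolve(amplitude, pop_size=60, generations=150):
    """Truncation selection on a single heritable trait: network size."""
    pop = [random.randint(5, 200) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda n: lifetime_fitness(n, amplitude), reverse=True)
        parents = pop[: pop_size // 2]
        pop = [max(1, p + random.choice((-2, -1, 0, 1, 2)))  # size mutation
               for p in parents for _ in range(2)]
    return sum(pop) / len(pop)

stable_size = evolve(amplitude=0.1)
seasonal_size = evolve(amplitude=0.9)
print(f"mean evolved ANN size, mild seasonality:   {stable_size:.1f}")
print(f"mean evolved ANN size, strong seasonality: {seasonal_size:.1f}")
```

In this toy setting the analytically optimal size is roughly `100 * R - 1`, where `R` is mean effective intake (about 0.97 for mild and 0.71 for strong seasonality), so the strongly seasonal population converges to a markedly smaller mean network size.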

📝 Abstract
The Cognitive Buffer Hypothesis (CBH) posits that larger brains evolved to enhance survival in changing conditions. However, larger brains also carry higher energy demands, imposing additional metabolic burdens. Alongside brain size, brain organization plays a key role in cognitive ability and, with suitable architectures, may help mitigate energy challenges. This study evolves Artificial Neural Networks (ANNs) used by Reinforcement Learning (RL) agents to investigate how environmental variability and energy costs influence the evolution of neural complexity, defined in terms of ANN size and structure. Results indicate that under energy constraints, increasing seasonality led to smaller ANNs. This challenges CBH and supports the Expensive Brain Hypothesis (EBH), as highly seasonal environments reduced net energy intake and thereby constrained brain size. ANN structural complexity primarily emerged as a byproduct of size, where energy costs promoted the evolution of more efficient networks. These results highlight the role of energy constraints in shaping neural complexity, offering in silico support for biological theory and energy-efficient robotic design.
Problem

Research questions and friction points this paper is trying to address.

Investigates how environmental variability and energy costs influence neural complexity evolution
Examines whether brain organization can mitigate energy challenges of larger brains
Tests Cognitive Buffer Hypothesis against Expensive Brain Hypothesis under energy constraints
Innovation

Methods, ideas, or system contributions that make the work stand out.

Evolving neural networks under energy constraints
Investigating environmental variability impact on complexity
Promoting efficient network structures through energy costs
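
The "information-processing efficiency per unit energy" framing from the summary can be made concrete with a toy metric. The cost coefficients, network sizes, and performance numbers below are illustrative assumptions, not values from the paper: energy scales with both neuron and connection counts, and efficiency is task performance divided by that energy.

```python
def metabolic_cost(n_neurons, n_connections,
                   neuron_cost=1.0, synapse_cost=0.1):
    """Toy energy model: both units and active synapses consume energy
    (coefficients are illustrative, not from the paper)."""
    return neuron_cost * n_neurons + synapse_cost * n_connections

def efficiency(performance, n_neurons, n_connections):
    """Information-processing efficiency per unit energy (illustrative)."""
    return performance / metabolic_cost(n_neurons, n_connections)

# A sparser network achieving the same task performance is strictly
# more efficient, which is the structure energy costs select for.
dense = efficiency(0.9, n_neurons=100, n_connections=4000)
sparse = efficiency(0.9, n_neurons=60, n_connections=900)
print(f"dense:  {dense:.4f}")   # 0.9 / (100 + 400)
print(f"sparse: {sparse:.4f}")  # 0.9 / (60 + 90)
```

Under any such metric, an energy penalty in the fitness function rewards architectures that hold performance while shedding neurons and synapses, which is the "efficiency optimization, not size scaling" trajectory the paper reports.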