Energy-Efficient Information Representation in MNIST Classification Using Biologically Inspired Learning

📅 2026-02-28
📈 Citations: 0
Influential citations: 0
🤖 AI Summary
This work proposes a biologically inspired learning rule that emulates structural plasticity in the brain to dynamically optimize synaptic connectivity during learning, addressing the redundancy and high energy consumption commonly caused by overparameterization in artificial neural networks. Applied to the MNIST classification task, the method automatically prunes unnecessary synapses while preserving essential connections and reserving capacity for new memories, all without requiring a predefined network architecture. By eschewing traditional backpropagation and integrating information-theoretic principles, the approach achieves efficient, low-redundancy representations. Experimental results demonstrate superior performance over backpropagation in classification accuracy, memory capacity, and energy efficiency, offering a novel paradigm for scalable and sustainable AI systems.
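The summary appeals to information-theoretic efficiency. One common way to make that concrete is bits of stored information per synapse; note this is an assumed proxy metric, not necessarily the paper's exact definition, and the numbers below (60,000 labels, 50,000 surviving synapses) are illustrative placeholders.

```python
import math

# Hypothetical efficiency proxy: bits stored per synapse. An assumption for
# illustration, not the paper's reported metric.
def bits_per_synapse(n_patterns: int, n_classes: int, n_synapses: int) -> float:
    # Each stored pattern-label association carries up to log2(n_classes) bits.
    stored_bits = n_patterns * math.log2(n_classes)
    return stored_bits / n_synapses

# Example: 60,000 MNIST labels (log2(10) ≈ 3.32 bits each) stored in a
# pruned network with 50,000 surviving synapses.
print(f"{bits_per_synapse(60_000, 10, 50_000):.3f} bits/synapse")
```

Under this proxy, pruning synapses while preserving stored associations directly raises the bits-per-synapse figure, which is the intuition behind the efficiency claim.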

📝 Abstract
Efficient representation learning is essential for optimal information storage and classification, yet it is frequently overlooked in artificial neural networks (ANNs). This neglect yields networks that can be overparameterized by factors of up to 13, increasing redundancy and energy consumption. As demand for large language models (LLMs) grows and their scale increases, these issues are amplified, raising significant ethical and environmental concerns. We analyze our previously developed biologically inspired learning rule using information-theoretic concepts, evaluating its efficiency on the MNIST classification task. The proposed rule, which emulates the brain's structural plasticity, naturally prevents overparameterization by optimizing synaptic usage and retaining only the essential number of synapses. It outperforms backpropagation (BP) in terms of efficiency and storage capacity, eliminates the need for pre-optimization of the network architecture, enhances adaptability, and reflects the brain's ability to reserve 'space' for new memories. This approach advances scalable and energy-efficient AI and provides a promising framework for developing brain-inspired models that optimize resource allocation and adaptability.
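The paper's actual learning rule is not reproduced here. The sketch below is a minimal stand-in that pairs a one-shot Hebbian correlation rule (no backpropagation) with magnitude-based synapse pruning on MNIST-like toy data, to illustrate the structural-plasticity idea of retaining only essential connections. All names and parameters (keep_fraction, the synthetic prototypes, the 10% survival rate) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for MNIST: binary patterns built from class prototypes plus noise.
n_classes, n_inputs, n_samples = 10, 784, 1000
labels = rng.integers(0, n_classes, n_samples)
prototypes = (rng.random((n_classes, n_inputs)) < 0.2).astype(float)
noise = (rng.random((n_samples, n_inputs)) < 0.05).astype(float)
X = np.clip(prototypes[labels] + noise, 0.0, 1.0)
Y = np.eye(n_classes)[labels]

# One-shot Hebbian correlation learning: weights from input-output
# co-activation statistics, with no backpropagation.
W = X.T @ Y / n_samples

# Structural-plasticity stand-in: prune all but the strongest synapses,
# mimicking the retention of only "essential" connections.
keep_fraction = 0.10
threshold = np.quantile(np.abs(W), 1.0 - keep_fraction)
mask = np.abs(W) >= threshold
W_pruned = W * mask

pred = np.argmax(X @ W_pruned, axis=1)
print(f"surviving synapses: {mask.mean():.1%}")
print(f"training accuracy after pruning: {(pred == labels).mean():.2%}")
```

On this easily separable toy data, accuracy survives aggressive pruning; the sketch only illustrates the mechanics, not the paper's reported results against backpropagation.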
Problem

Research questions and friction points this paper is trying to address.

energy efficiency
overparameterization
efficient representation learning
information storage
artificial neural networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

biologically inspired learning
structural plasticity
energy-efficient AI
overparameterization
information representation
👥 Authors
Patrick Stricker
KEIM Institute, Albstadt-Sigmaringen University, Germany
Florian Röhrbein
Chemnitz University of Technology (TUC), Germany
Andreas Knoblauch
KEIM Institute, Albstadt-Sigmaringen University, Germany