Rethinking Intelligence: Brain-like Neuron Network

📅 2026-01-27
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work proposes a brain-inspired neural network paradigm, the Brain-like Neural Network (BNN), and introduces LuminaNet, its first instantiation: a dynamic architecture that autonomously evolves its structure without relying on convolution or self-attention mechanisms. Departing from conventional artificial neural networks that depend on handcrafted architectures and inductive biases, LuminaNet eschews fixed topologies and human-imposed priors, instead leveraging neuroscience-inspired mechanisms to enable adaptive structural reconfiguration. Experimental results demonstrate that LuminaNet achieves 11.19% and 5.46% higher accuracy than LeNet-5 and AlexNet, respectively, on CIFAR-10, outperforming various MLP and Vision Transformer (ViT) models. On the TinyStories benchmark, it attains a perplexity of 8.4 while reducing computational cost by 25% and peak memory usage by nearly 50%.

Technology Category

Application Category

๐Ÿ“ Abstract
Since their inception, artificial neural networks have relied on manually designed architectures and inductive biases to better adapt to data and tasks. With the rise of deep learning and the expansion of parameter spaces, they have begun to exhibit brain-like functional behaviors. Nevertheless, artificial neural networks remain fundamentally different from biological neural systems in structural organization, learning mechanisms, and evolutionary pathways. From the perspective of neuroscience, we rethink the formation and evolution of intelligence and propose a new neural network paradigm, the Brain-like Neural Network (BNN). We further present the first instantiation of a BNN, termed LuminaNet, which operates without convolutions or self-attention and is capable of autonomously modifying its architecture. We conduct extensive experiments demonstrating that LuminaNet can achieve self-evolution through dynamic architectural changes. On CIFAR-10, LuminaNet achieves top-1 accuracy improvements of 11.19% and 5.46% over LeNet-5 and AlexNet, respectively, outperforming MLP-Mixer, ResMLP, and DeiT-Tiny among MLP/ViT architectures. On the TinyStories text generation task, LuminaNet attains a perplexity of 8.4, comparable to a single-layer GPT-2-style Transformer, while reducing computational cost by approximately 25% and peak memory usage by nearly 50%. Code and interactive structures are available at https://github.com/aaroncomo/LuminaNet.
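To make the abstract's central idea concrete, the sketch below shows what "autonomously modifying its architecture" can mean in the simplest possible terms: a network whose hidden layer can grow and prune at runtime rather than being fixed at design time. This is only an illustrative toy, not LuminaNet's actual mechanism (the paper derives its structural rules from neuroscience); the class name, growth trigger, and pruning criterion here are all assumptions made for the example.

```python
import numpy as np

# Illustrative toy only: a tiny MLP whose hidden width can change at runtime.
# This is NOT LuminaNet's method; it just demonstrates the general notion of
# a network editing its own structure instead of using a fixed topology.
class DynamicMLP:
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        self.rng = np.random.default_rng(seed)
        self.W1 = self.rng.normal(0, 0.1, (n_in, n_hidden))
        self.W2 = self.rng.normal(0, 0.1, (n_hidden, n_out))

    @property
    def hidden_width(self):
        return self.W1.shape[1]

    def forward(self, x):
        h = np.maximum(0.0, x @ self.W1)  # ReLU hidden layer
        return h @ self.W2

    def grow(self, k=1):
        # Structural change: append k fresh hidden units with small weights.
        n_in, n_out = self.W1.shape[0], self.W2.shape[1]
        self.W1 = np.hstack([self.W1, self.rng.normal(0, 0.1, (n_in, k))])
        self.W2 = np.vstack([self.W2, self.rng.normal(0, 0.1, (k, n_out))])

    def prune(self, threshold=0.05):
        # Structural change: drop hidden units whose outgoing weights are
        # near zero (an assumed, simplistic importance criterion).
        keep = np.linalg.norm(self.W2, axis=1) > threshold
        if keep.any():  # never prune the entire layer
            self.W1 = self.W1[:, keep]
            self.W2 = self.W2[keep, :]

net = DynamicMLP(n_in=4, n_hidden=8, n_out=2)
net.grow(4)       # 8 -> 12 hidden units
net.prune(0.05)   # remove low-contribution units
y = net.forward(np.ones((1, 4)))  # forward pass still valid after both edits
```

In a real self-evolving system the grow/prune decisions would be driven by learning signals rather than fixed thresholds, which is where the paper's neuroscience-inspired mechanisms come in.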
Problem

Research questions and friction points this paper is trying to address.

artificial neural networks
biological neural systems
intelligence evolution
brain-like architecture
neural network paradigm
Innovation

Methods, ideas, or system contributions that make the work stand out.

Brain-like Neural Network
Self-evolving Architecture
Dynamic Structure Modification
Convolution-free
Neuroscience-inspired AI
🔎 Similar Papers
No similar papers found.