🤖 AI Summary
Existing deep networks lack a systematic way to model the interplay among signal strength, coupling structure, and state evolution, limiting their capacity for higher-order dynamic perception. This work proposes a novel neural architecture grounded in Kirchhoff's Current Law, introducing circuit-theoretic physical principles into deep learning for the first time. By leveraging ordinary differential equations, the framework enables numerically stable intra-layer state updates that explicitly disentangle and encode higher-order dynamic components. The resulting model maintains physical consistency, interpretability, and end-to-end differentiability, supporting higher-order dynamics modeling within a single layer. Empirical evaluations demonstrate its superiority over state-of-the-art methods on both partial differential equation solving and ImageNet image classification.
📝 Abstract
Deep learning architectures are fundamentally inspired by neuroscience, particularly the structure of the brain's sensory pathways, and have achieved remarkable success in learning informative data representations. Although these architectures mimic the communication mechanisms of biological neurons, their strategies for encoding and transmitting information are fundamentally distinct. Biological systems depend on dynamic fluctuations in membrane potential; by contrast, conventional deep networks optimize the weights and biases of inter-neural connections, lacking a systematic mechanism to jointly characterize the interplay among signal intensity, coupling structure, and state evolution. To address this limitation, we propose the Kirchhoff-Inspired Neural Network (KINN), a state-variable-based architecture built on Kirchhoff's current law. KINN derives numerically stable state updates from fundamental ordinary differential equations, enabling the explicit decoupling and encoding of higher-order evolutionary components within a single layer while preserving physical consistency, interpretability, and end-to-end trainability. Extensive experiments on partial differential equation (PDE) solving and ImageNet image classification validate that KINN outperforms existing state-of-the-art methods.
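To make the idea concrete, the following is a minimal, hypothetical sketch of a Kirchhoff-style state update (not the paper's actual KINN layer, whose details the abstract does not specify). Kirchhoff's current law requires the currents at each node to balance, so with node states `v` as membrane-potential-like variables, a learnable conductance matrix `g` as the coupling structure, and an external input current `i_ext` as the signal, one forward-Euler step of the resulting ODE looks like:

```python
import numpy as np

def kcl_layer_step(v, g, i_ext, c=1.0, dt=0.1):
    """One state update of a hypothetical KCL-inspired layer.

    Kirchhoff's current law at node i gives the ODE
        C dv_i/dt = sum_j g_ij * (v_j - v_i) + i_ext_i,
    i.e. coupling currents from neighboring nodes plus an
    external input current. Here `g` (conductances) plays the
    role of learnable weights; this sketch integrates the ODE
    with a single explicit (forward-Euler) step of size `dt`.
    """
    # sum_j g_ij * (v_j - v_i), vectorized over all nodes
    coupling = g @ v - g.sum(axis=1) * v
    dv = (coupling + i_ext) / c
    return v + dt * dv
```

With a symmetric conductance matrix and no external input, every current leaving one node enters another, so the total state `v.sum()` is conserved across the update, a physical-consistency property of the kind the abstract alludes to.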