🤖 AI Summary
This work addresses the lack of a general, efficient training methodology for deep physical neural networks that can operate robustly and scalably on real hardware across arbitrary physical dynamics. The authors propose the Physical Information Bottleneck (PIB) framework, which, for the first time, localizes the information bottleneck principle to individual computational units. By enforcing matrix-based information constraints and leveraging local learning rules, PIB enables intrinsic training without digital surrogate models or contrastive measurements. The approach unifies supervised, unsupervised, and reinforcement learning, and is compatible with diverse physical hardware platforms, including memristive and photonic systems, while exhibiting strong fault tolerance and supporting geographically distributed parallel training. Together, these properties significantly enhance the practicality and scalability of physical neural networks.
📝 Abstract
Deep learning has revolutionized modern society but faces growing energy and latency constraints. Deep physical neural networks (PNNs) are interconnected computing systems that directly exploit analog dynamics for energy-efficient, ultrafast AI execution. Realizing this potential, however, requires universal training methods tailored to physical intricacies. Here, we present the Physical Information Bottleneck (PIB), a general and efficient framework that integrates information theory and local learning, enabling deep PNNs to learn under arbitrary physical dynamics. By allocating matrix-based information bottlenecks to each unit, we demonstrate supervised, unsupervised, and reinforcement learning across electronic memristive chips and optical computing platforms. PIB also adapts to severe hardware faults and allows for parallel training via geographically distributed resources. Bypassing auxiliary digital models and contrastive measurements, PIB recasts PNN training as an intrinsic, scalable information-theoretic process compatible with diverse physical substrates.
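The abstract does not spell out the per-unit objective, but "matrix-based information bottlenecks" plausibly refers to Gram-matrix entropy estimators in the style of the matrix-based Rényi α-entropy of Sánchez Giraldo, Rao, and Príncipe (2015). Below is a minimal Python sketch, under that assumption, of what a local information-bottleneck loss for one physical unit could look like: the kernel width `sigma`, trade-off weight `beta`, and all function names are illustrative, not the authors' released code.

```python
# Sketch of a per-unit, matrix-based information bottleneck loss.
# x: batch of unit inputs, t: measured physical activations of the unit,
# y: targets. All are (n_samples x dim) arrays. Assumptions: RBF kernel,
# Renyi order alpha=2, and an IB trade-off weight beta.
import numpy as np

def gram(z, sigma=1.0):
    """Trace-normalized RBF Gram matrix of a batch of activations z (n x d)."""
    sq = np.sum(z**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * z @ z.T
    k = np.exp(-d2 / (2.0 * sigma**2))
    return k / np.trace(k)

def renyi_entropy(a, alpha=2.0):
    """Matrix-based Renyi entropy S_alpha(A) = log2(tr(A^alpha)) / (1 - alpha)."""
    eig = np.clip(np.linalg.eigvalsh(a), 1e-12, None)
    return np.log2(np.sum(eig**alpha)) / (1.0 - alpha)

def joint_entropy(a, b, alpha=2.0):
    """Joint entropy from the re-normalized Hadamard product of Gram matrices."""
    ab = a * b
    return renyi_entropy(ab / np.trace(ab), alpha)

def mutual_information(a, b, alpha=2.0):
    """I(A; B) = S(A) + S(B) - S(A, B) in the matrix-based formulation."""
    return renyi_entropy(a, alpha) + renyi_entropy(b, alpha) - joint_entropy(a, b, alpha)

def bottleneck_loss(x, t, y, beta=5.0):
    """Per-unit IB objective: compress I(X; T) while preserving I(T; Y)."""
    gx, gt, gy = gram(x), gram(t), gram(y)
    return mutual_information(gx, gt) - beta * mutual_information(gt, gy)
```

Because the loss depends only on quantities measurable at a single unit (its inputs, its physical outputs, and the targets), it can serve as a local learning signal, which is consistent with the paper's claim of training without digital surrogate models or contrastive measurements.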