Local learning for stable backpropagation-free neural network training towards physical learning

📅 2026-03-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of in situ training in physical neural networks, which is hindered by the incompatibility of such systems with backpropagation and automatic differentiation. To overcome this limitation, the authors propose FFzero, a novel framework that, for the first time, enables stable training of multilayer networks without relying on backpropagation or automatic differentiation. FFzero integrates layer-wise local learning, prototype-based representations, and a directional derivative–based optimization strategy, complemented by a purely forward evaluation mechanism applicable to both multilayer perceptrons and convolutional architectures. Experimental results demonstrate that FFzero achieves strong performance on classification and regression tasks and validates the feasibility of in situ physical learning in a simulated photonic neural network, offering a promising pathway toward trainable physical intelligent systems.

📝 Abstract
While backpropagation and automatic differentiation have driven deep learning's success, the physical limits of chip manufacturing and the rising environmental costs of deep learning motivate alternative learning paradigms such as physical neural networks. However, most existing physical neural networks still rely on digital computing for training, largely because backpropagation and automatic differentiation are difficult to realize in physical systems. We introduce FFzero, a forward-only learning framework enabling stable neural network training without backpropagation or automatic differentiation. FFzero combines layer-wise local learning, prototype-based representations, and directional-derivative-based optimization through forward evaluations only. We show that local learning is effective under forward-only optimization, where backpropagation fails. FFzero generalizes to multilayer perceptrons and convolutional neural networks across classification and regression. Using a simulated photonic neural network as an example, we demonstrate that FFzero provides a viable path toward backpropagation-free in-situ physical learning.
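The paper does not spell out FFzero's update rule here, but the abstract's core ingredient — directional-derivative-based optimization through forward evaluations only — can be illustrated with a generic SPSA-style sketch. The quadratic loss, the parameter vector `w`, and the step sizes `lr` and `eps` below are all illustrative assumptions, not the authors' setup; the point is only that each update uses two forward evaluations and no gradient computation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quadratic "layer loss": distance of parameters w to a fixed target.
# Stands in for one layer's local objective; the loop below only ever
# calls loss(w) -- pure forward evaluations, no backpropagation.
target = np.array([1.0, -2.0, 0.5])

def loss(w):
    return float(np.sum((w - target) ** 2))

w = np.zeros(3)          # hypothetical layer parameters
lr, eps = 0.05, 1e-3     # step size and probe size (illustrative values)

for _ in range(2000):
    # Draw a random unit direction to probe.
    v = rng.standard_normal(w.shape)
    v /= np.linalg.norm(v)
    # Central-difference estimate of the directional derivative along v,
    # obtained from two forward evaluations of the loss.
    d = (loss(w + eps * v) - loss(w - eps * v)) / (2 * eps)
    # Step against the probed slope; in expectation this follows -grad.
    w -= lr * d * v
```

In a physical system the two `loss(...)` calls would correspond to two hardware forward passes with perturbed parameters, which is what makes this family of methods attractive for in-situ learning.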
Problem

Research questions and friction points this paper is trying to address.

physical neural networks
backpropagation-free
in-situ learning
forward-only training
local learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

forward-only learning
local learning
physical neural networks
backpropagation-free
directional derivative optimization
Yaqi Guo
Department of Materials Science and Engineering, Delft University of Technology, Mekelweg 2, Delft, 2628CD, The Netherlands
Fabian Braun
Centre Suisse d'Electronique et de Microtechnique (CSEM), Switzerland
Bastiaan Ketelaar
Department of Precision and Microsystems Engineering, Delft University of Technology, Mekelweg 2, Delft, 2628CD, The Netherlands
Stephanie Tan
Delft University of Technology
Richard Norte
Department of Precision and Microsystems Engineering, Delft University of Technology, Mekelweg 2, Delft, 2628CD, The Netherlands
Siddhant Kumar
Doctoral student, University of Canterbury