Optimal Brain Connection: Towards Efficient Structural Pruning

📅 2025-08-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing structural pruning methods often neglect topological dependencies among parameters, leading to substantial performance degradation after compression. To address this, we propose a structured pruning framework that jointly optimizes connection importance and network performance. First, we introduce a Jacobian-based criterion to quantify the joint significance of parameters both within and across layers, explicitly modeling parameter interconnectivity. Second, we design an equivalent pruning mechanism that employs a lightweight autoencoder to preserve the gradient contributions of pruned connections, thereby mitigating accuracy loss during fine-tuning. Our approach integrates first-order gradient sensitivity analysis, structured pruning, and reconstruction-aware fine-tuning. Experiments demonstrate that the proposed Jacobian criterion consistently outperforms mainstream importance metrics across multiple benchmarks. Moreover, equivalent pruning yields an average Top-1 accuracy improvement of 2.3%, significantly reducing the performance drop typically incurred between pruning and fine-tuning.
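The first-order gradient-sensitivity idea described above can be illustrated with a generic Taylor-style, per-channel saliency score. This is a simplified sketch of the general principle, not the paper's actual Jacobian Criterion (which additionally models intra-component and inter-layer dependencies); the function name and toy tensor shapes are illustrative only:

```python
import numpy as np

def first_order_saliency(weights, grads):
    """Per-output-channel first-order Taylor saliency: the sum of
    |gradient * weight| over each channel's parameters. Channels
    with the lowest scores are candidates for structured removal."""
    scores = np.abs(weights * grads)                 # elementwise |g * w|
    return scores.reshape(scores.shape[0], -1).sum(axis=1)

# Toy conv layer: 4 output channels, each a 3x3 filter over 3 inputs.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 3, 3, 3))   # weights
g = rng.normal(size=(4, 3, 3, 3))   # gradients from one backward pass
scores = first_order_saliency(w, g)
prune_order = np.argsort(scores)    # least-salient channels first
```

Criteria that score each parameter in isolation like this are exactly what the paper argues against; its Jacobian Criterion replaces the independent per-channel sum with a joint measure over interconnected parameters.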

📝 Abstract
Structural pruning has been widely studied for its effectiveness in compressing neural networks. However, existing methods often neglect the interconnections among parameters. To address this limitation, this paper proposes a structural pruning framework termed Optimal Brain Connection. First, we introduce the Jacobian Criterion, a first-order metric for evaluating the saliency of structural parameters. Unlike existing first-order methods that assess parameters in isolation, our criterion explicitly captures both intra-component interactions and inter-layer dependencies. Second, we propose the Equivalent Pruning mechanism, which utilizes autoencoders to retain the contributions of all original connections, including pruned ones, during fine-tuning. Experimental results demonstrate that the Jacobian Criterion outperforms several popular metrics in preserving model performance, while the Equivalent Pruning mechanism effectively mitigates performance degradation after fine-tuning. Code: https://github.com/ShaowuChen/Optimal_Brain_Connection
Problem

Research questions and friction points this paper is trying to address.

Evaluating parameter saliency with inter-layer dependencies
Retaining contributions of pruned connections during fine-tuning
Improving model performance preservation in structural pruning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Jacobian Criterion evaluates structural parameter saliency
Equivalent Pruning retains pruned connection contributions
Autoencoders mitigate fine-tuning performance degradation
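As a rough illustration of the Equivalent Pruning idea, a rank-k linear encoder/decoder fitted by SVD can stand in for the paper's autoencoder: k retained components approximate the activations of all N original channels, so the contributions of pruned channels are not simply discarded. Everything here (shapes, the SVD-based fit, names) is a hypothetical sketch under that simplification, not the authors' implementation:

```python
import numpy as np

def linear_autoencoder(acts, k):
    """Fit a rank-k linear encoder/decoder over channel activations
    via SVD, so that k kept components reconstruct all N original
    channels -- a linear stand-in for an autoencoder."""
    mean = acts.mean(axis=0)
    _, _, Vt = np.linalg.svd(acts - mean, full_matrices=False)
    enc = Vt[:k].T          # (N, k): compress N channels down to k
    dec = Vt[:k]            # (k, N): reconstruct all N channels
    return enc, dec, mean

rng = np.random.default_rng(1)
# 256 samples of 8 strongly correlated channel activations
base = rng.normal(size=(256, 3))
acts = base @ rng.normal(size=(3, 8)) + 0.01 * rng.normal(size=(256, 8))

enc, dec, mean = linear_autoencoder(acts, k=3)
recon = (acts - mean) @ enc @ dec + mean
err = np.linalg.norm(acts - recon) / np.linalg.norm(acts)  # small relative error
```

The point of the sketch: when channels are redundant, a low-dimensional code can preserve nearly all of their joint contribution, which is what lets fine-tuning proceed without losing the pruned connections' gradient signal.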
Shaowu Chen
State Key Laboratory of Radio Frequency Heterogeneous Integration, Shenzhen University, China
Wei Ma
State Key Laboratory of Radio Frequency Heterogeneous Integration, Shenzhen University, China
Binhua Huang
School of Electrical and Electronic Engineering, University College Dublin, Ireland
Qingyuan Wang
School of Electrical and Electronic Engineering, University College Dublin, Ireland
Guoxin Wang
School of Electrical and Electronic Engineering, University College Dublin, Ireland
Weize Sun
Shenzhen University
Tensor, Multidimensional Signal Processing, Deep Neural Networks
Lei Huang
State Key Laboratory of Radio Frequency Heterogeneous Integration, Shenzhen University, China
Deepu John
University College Dublin
Edge Computing, IoT, Wearable Sensing, Biomedical Circuits and Systems