AI Summary
This work identifies *activation signal collapse*, a drastic decay in layer-wise activation variance, as the fundamental cause of severe accuracy degradation in one-shot neural network pruning, challenging the conventional attribution to the removal of critical weights. To address this, we propose **REFLOW**, a gradient-free, fine-tuning-free pruning method that requires no weight updates and no second-order approximations. REFLOW directly corrects the signal distribution via layer-wise variance normalization and rescaling during forward propagation, using only activation statistics, and thereby recovers high-quality sparse subnetworks within the original parameter space. On ResNeXt-101/ImageNet with 20% of the weights retained, REFLOW lifts top-1 accuracy from 4.1% (naively pruned baseline) to 78.9%, substantially outperforming state-of-the-art pruning methods. This is the first work to establish signal collapse as the core failure mechanism in one-shot pruning and to introduce a low-overhead, gradient-free paradigm for effective single-step sparsification.
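To make the mechanism concrete, here is a minimal NumPy sketch of the idea for a single linear layer: magnitude pruning zeroes small weights, and a rescaling step matches the pruned layer's output variance to the dense layer's using only activation statistics, with no weight updates. The function names (`magnitude_prune`, `reflow_rescale`) and the single-layer setup are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def magnitude_prune(w, keep_frac):
    """One-shot magnitude pruning: keep the largest-magnitude
    fraction of weights, zero out the rest (no retraining)."""
    k = max(1, int(w.size * keep_frac))
    thresh = np.sort(np.abs(w).ravel())[-k]  # k-th largest magnitude
    return np.where(np.abs(w) >= thresh, w, 0.0)

def reflow_rescale(x, w_dense, w_sparse):
    """Hypothetical variance-restoring rescale: compute the dense and
    pruned pre-activations on a batch of inputs, then scale the pruned
    output so its standard deviation matches the dense one. Only
    activation statistics are used; weights are never updated."""
    z_dense = x @ w_dense
    z_sparse = x @ w_sparse
    scale = z_dense.std() / (z_sparse.std() + 1e-8)
    return z_sparse * scale

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=(64, 64))      # dense layer weights
    x = rng.normal(size=(128, 64))     # a batch of activations
    w_sparse = magnitude_prune(w, keep_frac=0.2)
    z = reflow_rescale(x, w, w_sparse)
    # Pruning alone shrinks the output variance; rescaling restores it.
    print((x @ w_sparse).std(), z.std(), (x @ w).std())
```

In a real network this rescaling would be applied layer by layer during forward propagation, so variance decay does not compound with depth.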
Abstract
Neural network pruning is essential for reducing model complexity to enable deployment on resource-constrained hardware. While the performance loss of pruned networks is often attributed to the removal of critical parameters, we identify signal collapse, a reduction in activation variance across layers, as the root cause. Existing one-shot pruning methods focus on weight-selection strategies and rely on computationally expensive second-order approximations. In contrast, we demonstrate that mitigating signal collapse, rather than optimizing weight selection, is the key to improving the accuracy of pruned networks. We propose REFLOW, which addresses signal collapse without updating trainable weights, revealing high-quality sparse subnetworks within the original parameter space. REFLOW enables magnitude pruning to achieve state-of-the-art performance, restoring ResNeXt-101 accuracy on ImageNet from under 4.1% to 78.9% with only 20% of the weights retained, surpassing prior approaches.