🤖 AI Summary
This study addresses model compression for Randomly Wired Neural Networks (RWNNs) in COVID-19 chest X-ray classification. We propose a discrete Ricci curvature-based edge pruning method, introducing Forman-Ricci curvature—computationally more efficient than Ollivier-Ricci curvature—into neural network pruning for the first time, while preserving pruning efficacy. The method is further enhanced by integrating edge betweenness centrality and systematically evaluated across Erdős–Rényi, Watts–Strogatz, and Barabási–Albert topologies to demonstrate robustness. Experiments show that our approach significantly reduces parameter count and theoretical computational cost—yielding higher speedup—while maintaining classification accuracy, sensitivity, and specificity. Moreover, it achieves a superior trade-off between preserving modular structure and sustaining global information propagation efficiency.
📝 Abstract
Randomly Wired Neural Networks (RWNNs) serve as a valuable testbed for investigating the role of network topology in deep learning, capturing how different connectivity patterns affect both learning efficiency and model performance. At the same time, they provide a natural framework for exploring edge-centric network measures as tools for pruning and optimization. In this study, we investigate three edge-centric network measures, Forman-Ricci curvature (FRC), Ollivier-Ricci curvature (ORC), and edge betweenness centrality (EBC), to compress RWNNs by selectively retaining important synapses (edges) while pruning the rest. As a baseline, RWNNs are trained for COVID-19 chest X-ray image classification, with the goal of reducing network complexity while preserving performance in terms of accuracy, specificity, and sensitivity. We extend prior work on pruning RWNNs with ORC by incorporating two additional edge-centric measures, FRC and EBC, across three network generators: the Erdős–Rényi (ER), Watts–Strogatz (WS), and Barabási–Albert (BA) models. We provide a comparative analysis of the pruning performance of the three measures in terms of compression ratio and theoretical speedup. A central focus of our study is whether FRC, which is computationally more efficient than ORC, can achieve comparable pruning effectiveness. Beyond performance evaluation, we investigate the structural properties of the pruned networks through modularity and global efficiency, offering insights into the trade-off between modular segregation and network efficiency in compressed RWNNs. Our results provide initial evidence that FRC-based pruning can effectively simplify RWNNs, offering significant computational advantages while maintaining performance comparable to ORC.
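As a minimal sketch of the edge-centric pruning idea described above (not the paper's exact procedure), the combinatorial Forman-Ricci curvature of an unweighted edge (u, v) can be taken as F(u, v) = 4 − deg(u) − deg(v), and edges can then be ranked by this score on a Watts–Strogatz generator graph. The retention ratio and the ranking direction here are illustrative assumptions, as are the graph size and wiring parameters:

```python
import networkx as nx  # third-party library for graph generation and measures

def forman_ricci(G, u, v):
    # Combinatorial Forman-Ricci curvature of an unweighted edge (u, v):
    # F(u, v) = 4 - deg(u) - deg(v)  (simple, non-augmented form)
    return 4 - G.degree(u) - G.degree(v)

def prune_by_frc(G, keep_ratio=0.7):
    # Rank edges by FRC and retain the top fraction; the actual retention
    # rule used in the paper may differ (this is an illustrative criterion).
    ranked = sorted(G.edges(), key=lambda e: forman_ricci(G, *e), reverse=True)
    n_keep = int(len(ranked) * keep_ratio)
    H = nx.Graph()
    H.add_nodes_from(G.nodes())       # keep all nodes (neural units)
    H.add_edges_from(ranked[:n_keep]) # keep only the highest-curvature edges
    return H

# WS generator as one of the three topologies studied (n, k, p are assumed values)
G = nx.watts_strogatz_graph(32, 4, 0.25, seed=0)
H = prune_by_frc(G)

# Structural evaluation of the pruned graph, as in the abstract
eff = nx.global_efficiency(H)
print(G.number_of_edges(), H.number_of_edges(), round(eff, 3))
```

The same ranking loop could swap in `nx.edge_betweenness_centrality(G)` for EBC-based pruning, and `nx.community.modularity` with detected communities for the modularity side of the trade-off analysis.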