Meta Pruning via Graph Metanetworks : A Meta Learning Framework for Network Pruning

📅 2025-05-24
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing pruning methods either rely on manually designed, coarse-grained heuristics or require task-specific retraining, limiting their generality and automation. To address this, we propose the first end-to-end automated pruning framework based on metanetworks operating on primitive graph representations: the original network is bijectively mapped to a graph; a graph neural network (GNN)-based metanetwork then generates an equivalent, more prune-friendly architecture, so that pruning requires only a forward pass and lightweight fine-tuning. The approach integrates meta-learning, differentiable graph mapping, and neural architecture encoding, bypassing the bottleneck of handcrafted pruning criteria. Extensive experiments on CIFAR-10/100 and ImageNet demonstrate substantial improvements over state-of-the-art methods, with minimal accuracy degradation and strong generalization across diverse architectures, including ResNet56, VGG19, and ResNet50.

📝 Abstract
Network pruning, aimed at reducing network size while preserving accuracy, has attracted significant research interest. Numerous pruning techniques have been proposed over time. They are becoming increasingly effective, but also more complex and harder to interpret. Given the inherent complexity of neural networks, we argue that manually designing pruning criteria has reached a bottleneck. To address this, we propose a novel approach in which we "use a neural network to prune neural networks". More specifically, we introduce the newly developed idea of the metanetwork from meta-learning into pruning. A metanetwork is a network that takes another network as input and produces a modified network as output. In this paper, we first establish a bijective mapping between neural networks and graphs, and then employ a graph neural network as our metanetwork. We train a metanetwork that automatically learns a pruning strategy, transforming a network that is hard to prune into one that is much easier to prune. Once the metanetwork is trained, pruning requires nothing more than a feedforward pass through the metanetwork and standard finetuning to reach state-of-the-art results. Our method achieved outstanding results on many popular and representative pruning tasks (including ResNet56 on CIFAR10, VGG19 on CIFAR100, and ResNet50 on ImageNet). Our code is available at https://github.com/Yewei-Liu/MetaPruning
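The core pipeline the abstract describes — map a network to a graph, let a GNN score it, then prune — can be illustrated with a minimal sketch. This is not the authors' implementation: the weights, the one-step message passing, and the top-k rule below are simplified placeholders standing in for the trained metanetwork. Here neurons are graph nodes, weights are edge features, and one aggregation round over incoming and outgoing edge magnitudes yields a per-neuron saliency used to drop hidden units.

```python
# Toy 2-layer MLP: 3 -> 4 -> 2, weights as nested lists (biases omitted).
# These values are illustrative; in the paper a trained GNN metanetwork,
# not a fixed magnitude rule, decides what becomes easy to prune.
W1 = [[0.9, -0.1, 0.3],
      [0.05, 0.02, -0.01],
      [-0.7, 0.6, 0.2],
      [0.4, -0.5, 0.8]]          # hidden x input
W2 = [[0.6, 0.01, -0.3, 0.7],
      [-0.2, 0.03, 0.5, -0.4]]   # output x hidden

def hidden_scores(w_in, w_out):
    """Graph view: each hidden neuron is a node; one message-passing step
    sums the absolute weights on its incoming and outgoing edges."""
    scores = []
    for j in range(len(w_in)):
        incoming = sum(abs(w) for w in w_in[j])
        outgoing = sum(abs(row[j]) for row in w_out)
        scores.append(incoming + outgoing)
    return scores

scores = hidden_scores(W1, W2)

# Keep the top-k hidden neurons and slice both weight matrices accordingly;
# only a forward scoring pass is needed, mirroring the "prune by inference"
# idea (finetuning would follow in practice).
k = 3
keep = sorted(sorted(range(len(scores)), key=lambda j: scores[j])[-k:])
W1_pruned = [W1[j] for j in keep]
W2_pruned = [[row[j] for j in keep] for row in W2]

print(keep)  # the near-zero neuron (index 1) is dropped
```

The near-zero hidden neuron (row 1 of `W1`) receives the lowest saliency and is removed, shrinking the layer from 4 to 3 units while both weight matrices stay shape-consistent.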
Problem

Research questions and friction points this paper is trying to address.

Automating pruning criteria via neural networks for network compression
Achieving universal applicability across diverse network architectures
Eliminating special training requirements during the pruning process
Innovation

Methods, ideas, or system contributions that make the work stand out.

Meta-learning framework automates network pruning via metanetworks
Bijective network-to-graph mapping lets a GNN metanetwork process networks
Pruning achieved through a feedforward pass of the metanetwork, without special training