NeuSemSlice: Towards Effective DNN Model Maintenance via Neuron-level Semantic Slicing

📅 2024-07-26
🏛️ ACM Transactions on Software Engineering and Methodology
📈 Citations: 0 · Influential: 0
🤖 AI Summary
Existing layer-level maintenance methods for deep neural networks (DNNs) cannot support fine-grained, neuron-level operations. Method: this paper introduces neuron-level semantic slicing, a technique that applies semantic slicing to DNN maintenance. It leverages activation-pattern similarity measurement, cross-layer semantic clustering, and dynamic component merging to identify, categorize, and fuse semantically similar critical neurons across layers, yielding editable, fine-grained semantic components. Contribution/Results: unlike conventional layer-wise partitioning, the approach overcomes accuracy bottlenecks and supports diverse maintenance tasks, including model compression, adaptation to new samples, and continual learning. Experiments show consistent improvements over baselines across all maintenance tasks: the method preserves model performance while improving maintenance efficiency and generalization stability.
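The summary's pipeline (score neurons by activation pattern, then fuse similar components) can be sketched at a high level. This is a hypothetical illustration, not the paper's implementation: the criticality criterion (mean activation magnitude), the cosine-similarity merge rule, and the `keep_ratio`/`sim_threshold` parameters are all assumptions for the sake of the example.

```python
import numpy as np

def critical_neurons(activations, keep_ratio=0.2):
    """Rank one layer's neurons by mean activation magnitude over a
    class's samples and keep the top fraction as 'critical'.
    (Illustrative criterion; the paper's exact scoring may differ.)"""
    # activations: (num_samples, num_neurons) for one layer / one class
    scores = np.abs(activations).mean(axis=0)
    k = max(1, int(keep_ratio * scores.size))
    return np.argsort(scores)[-k:]  # indices of the critical neurons

def merge_similar(components, sim_threshold=0.9):
    """Greedily fuse semantic components (0/1 neuron masks) whose
    activation patterns are highly similar under cosine similarity,
    mimicking 'dynamic component merging' at a high level."""
    merged = []
    for comp in components:
        comp = np.asarray(comp, dtype=float)
        for m in merged:
            sim = comp @ m / (np.linalg.norm(comp) * np.linalg.norm(m) + 1e-12)
            if sim >= sim_threshold:
                np.maximum(m, comp, out=m)  # fuse into existing component
                break
        else:
            merged.append(comp.copy())  # no match: start a new component
    return merged
```

Run per class and per layer, this yields the fine-grained semantic components that the later maintenance tasks operate on.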

📝 Abstract
Deep neural networks (DNNs), extensively applied across diverse disciplines, are characterized by their integrated and monolithic architectures, setting them apart from conventional software systems. This architectural difference introduces particular challenges for maintenance tasks such as model restructuring (e.g., model compression), re-adaptation (e.g., fitting new samples), and incremental development (e.g., continual knowledge accumulation). Prior research addresses these challenges by identifying task-critical neuron layers and dividing neural networks into semantically similar sequential modules. However, such layer-level approaches fail to precisely identify and manipulate neuron-level semantic components, restricting their applicability to finer-grained model maintenance tasks. In this work, we implement NeuSemSlice, a novel framework that introduces the semantic slicing technique to effectively identify critical neuron-level semantic components in DNN models for semantic-aware model maintenance tasks. Specifically, semantic slicing identifies, categorizes, and merges critical neurons across different categories and layers according to their semantic similarity, making them flexible and effective in subsequent tasks. For semantic-aware model maintenance, we provide a series of novel strategies based on semantic slicing to enhance NeuSemSlice: preservation of semantic components (i.e., critical neurons) for model restructuring, critical neuron tuning for model re-adaptation, and non-critical neuron training for model incremental development. A thorough evaluation demonstrates that NeuSemSlice significantly outperforms baselines in all three tasks.
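The abstract's three strategies differ mainly in which neurons a maintenance step is allowed to touch. A minimal sketch of that idea, assuming a plain SGD update with a row-wise neuron mask (the function name, `mode` values, and learning rate are hypothetical, not the paper's API):

```python
import numpy as np

def masked_sgd_step(W, grad, critical_idx, mode, lr=0.01):
    """One SGD step restricted to a subset of neurons (rows of W).
    mode='re-adapt'    -> update only critical neurons
                          (critical neuron tuning for re-adaptation).
    mode='incremental' -> update only non-critical neurons, freezing
                          the critical semantic component
                          (non-critical neuron training).
    Hypothetical illustration of the abstract's strategies, not the
    paper's implementation."""
    mask = np.zeros(W.shape[0], dtype=bool)
    mask[critical_idx] = True          # rows holding critical neurons
    if mode == "incremental":
        mask = ~mask                   # invert: train the rest instead
    W = W.copy()
    W[mask] -= lr * grad[mask]         # update only the selected rows
    return W
```

Restructuring is then the remaining case: keep only the critical rows and drop the rest, rather than updating either subset.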
Problem

Research questions and friction points this paper is trying to address.

How to identify neuron-level semantic components in DNNs to support maintenance
How to enable finer-grained model restructuring and re-adaptation than layer-level methods allow
How to support model compression, fitting new samples, and continual knowledge accumulation without sacrificing accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neuron-level semantic slicing for DNN maintenance
Semantic similarity-based neuron categorization and merging
Novel strategies for semantic-aware model maintenance tasks
Shide Zhou
Huazhong University of Science and Technology, China
Tianlin Li
Nanyang Technological University, Singapore
Yihao Huang
Nanyang Technological University, Singapore
Ling Shi
Nanyang Technological University, Singapore
Kailong Wang
Huazhong University of Science and Technology, China
Yang Liu
Nanyang Technological University, Singapore
Haoyu Wang
Huazhong University of Science and Technology, China