Lightweight User-Personalization Method for Closed Split Computing

📅 2026-03-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the performance degradation of closed-set segmentation under data distribution shifts, unreliable communication, and privacy-preserving perturbations, where the model architecture and parameters are inaccessible. To tackle this challenge, we propose SALT (Split-Adaptive Lightweight Tuning), the first lightweight multi-objective adaptation framework that operates without any knowledge of internal model details. SALT deploys trainable lightweight adapters on the client side to refine intermediate representations from a frozen backbone, requiring no model modification or additional communication overhead. Experiments demonstrate that SALT improves personalized accuracy on CIFAR-10 from 88.1% to 93.8% while reducing training latency by over 60%. It maintains above 90% accuracy under a 75% packet loss rate and achieves approximately 88% accuracy under Gaussian noise perturbation with σ = 1.0.

📝 Abstract
Split Computing enables collaborative inference between edge devices and the cloud by partitioning a deep neural network into an edge-side head and a server-side tail, reducing latency and limiting exposure of raw input data. However, inference performance often degrades in practical deployments due to user-specific data distribution shifts, unreliable communication, and privacy-oriented perturbations, especially in closed environments where model architectures and parameters are inaccessible. To address this challenge, we propose SALT (Split-Adaptive Lightweight Tuning), a lightweight adaptation framework for closed Split Computing systems. SALT introduces a compact client-side adapter that refines intermediate representations produced by a frozen head network, enabling effective model adaptation without modifying the head or tail networks or increasing communication overhead. By modifying only the training conditions, SALT supports multiple adaptation objectives, including user personalization, communication robustness, and privacy-aware inference. Experiments using ResNet-18 on CIFAR-10 and CIFAR-100 show that SALT achieves higher accuracy than conventional retraining and fine-tuning while significantly reducing training cost. On CIFAR-10, SALT improves personalized accuracy from 88.1% to 93.8% while reducing training latency by more than 60%. SALT also maintains over 90% accuracy under 75% packet loss and preserves high accuracy (about 88% at σ = 1.0) under noise injection. These results demonstrate that SALT provides an efficient and practical adaptation framework for real-world Split Computing systems.
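The abstract's core idea — a trainable adapter that refines the frozen head's intermediate representation without changing its shape — can be sketched as follows. The paper does not disclose the adapter's architecture, so the residual bottleneck, the dimensions, and the zero-initialized up-projection here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration only.
in_dim, feat_dim, bottleneck = 32, 64, 8

# Stand-in for the closed, inaccessible head network: its weights are
# frozen and never updated on the client.
W_head = rng.standard_normal((in_dim, feat_dim)) * 0.1

def frozen_head(x):
    """Closed edge-side head: produces the intermediate representation."""
    return np.maximum(x @ W_head, 0.0)  # ReLU

# Trainable client-side adapter: a residual bottleneck whose output keeps
# the feature dimension unchanged, so the tensor sent to the server-side
# tail has the same shape and incurs no extra communication overhead.
W_down = rng.standard_normal((feat_dim, bottleneck)) * 0.1
W_up = np.zeros((bottleneck, feat_dim))  # zero-init: adapter starts as identity

def adapter(z):
    return z + np.maximum(z @ W_down, 0.0) @ W_up

x = rng.standard_normal((4, in_dim))
z = frozen_head(x)          # frozen intermediate representation
z_adapted = adapter(z)      # refined representation sent to the tail

assert z_adapted.shape == z.shape   # same payload size to the server tail
```

Only `W_down` and `W_up` would be updated during client-side training; swapping the training data or objective (personalization, robustness to packet loss, noise-aware inference) changes what the adapter learns without touching the head or tail, which mirrors the multi-objective claim in the abstract.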
Problem

Research questions and friction points this paper is trying to address.

Split Computing
data distribution shift
communication unreliability
privacy perturbation
closed environment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Split Computing
Lightweight Adaptation
User Personalization
Closed Environment
Intermediate Representation Tuning