Efficient Edge Test-Time Adaptation via Latent Feature Coordinate Correction

📅 2025-10-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Edge devices face significant challenges in test-time adaptation (TTA) due to stringent computational constraints and distribution shifts. Existing TTA methods either rely on gradient-based optimization or batch processing, both incompatible with edge deployment, or adopt gradient-free strategies that suffer from weak learning capability, poor flexibility, or tight architectural coupling. This paper proposes TED, a backward-pass-free, single-instance forward TTA method. Its core innovation is the first application of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) to TTA: TED performs forward-only optimization of low-dimensional feature coordinates within the latent principal subspace, keeping model parameters frozen while dynamically adapting its search directions. Evaluated on ImageNet and Google Speech Commands, TED achieves state-of-the-art accuracy with up to 63× lower computational overhead. It is successfully deployed on a ZYNQ-7020 FPGA, demonstrating the feasibility of lightweight, memoryless, real-time adaptation on resource-constrained edge hardware.

📝 Abstract
Edge devices face significant challenges due to limited computational resources and distribution shifts, making efficient and adaptable machine learning essential. Existing test-time adaptation (TTA) methods often rely on gradient-based optimization or batch processing, which are inherently unsuitable for resource-constrained edge scenarios due to their reliance on backpropagation and high computational demands. Gradient-free alternatives address these issues but often suffer from limited learning capacity, lack flexibility, or impose architectural constraints. To overcome these limitations, we propose a novel single-instance TTA method tailored for edge devices (TED), which employs forward-only coordinate optimization in the latent principal subspace using the covariance matrix adaptation evolution strategy (CMA-ES). By updating a compact low-dimensional vector, TED not only enhances output confidence but also aligns the latent representation closer to the source latent distribution within the latent principal subspace. This is achieved without backpropagation, keeping the model parameters frozen, and enabling efficient, forgetting-free adaptation with minimal memory and computational overhead. Experiments on image classification and keyword spotting tasks across the ImageNet and Google Speech Commands dataset series demonstrate that TED achieves state-of-the-art performance while *reducing computational complexity by up to 63 times*, offering a practical and scalable solution for real-world edge applications. Furthermore, we successfully *deployed TED on the ZYNQ-7020 platform*, demonstrating its feasibility and effectiveness for resource-constrained edge devices in real-world deployments.
Problem

Research questions and friction points this paper is trying to address.

Adapting models efficiently on edge devices with limited resources
Overcoming computational constraints of gradient-based test-time adaptation
Correcting latent feature coordinates without backpropagation for edge deployment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Performs forward-only coordinate optimization in the latent principal subspace
Employs CMA-ES for efficient adaptation without backpropagation
Updates a compact low-dimensional vector to align latent representations with the source distribution
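The adaptation loop these bullets describe can be sketched with toy stand-ins. This is a minimal illustration, not the authors' implementation: the frozen classifier `W`, subspace basis `U`, latent feature `z0`, and all hyperparameters are invented, and the covariance update is a simplified rank-mu variant of CMA-ES (no step-size or evolution-path adaptation). Fitness here is prediction entropy, so every candidate coordinate is scored by forward passes only and the model weights never change.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy stand-ins (names, shapes, and values are illustrative) ---
D, K, d = 32, 10, 4                            # latent dim, classes, subspace dim
W = rng.normal(size=(K, D))                    # frozen classifier head
U = np.linalg.qr(rng.normal(size=(D, d)))[0]   # orthonormal principal-subspace basis
z0 = rng.normal(size=D)                        # latent feature of one test instance

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def entropy(c):
    """Fitness: prediction entropy after shifting the latent feature by the
    low-dimensional coordinate c (evaluated with forward passes only)."""
    p = softmax(W @ (z0 + U @ c))
    return -(p * np.log(p + 1e-12)).sum()

# --- Simplified CMA-ES loop: rank-mu covariance update, fixed step size ---
lam, mu = 8, 4                                 # population and parent sizes
weights = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
weights /= weights.sum()                       # recombination weights
m, sigma, C = np.zeros(d), 0.5, np.eye(d)      # search mean, step size, covariance
c_cov = 0.3                                    # covariance learning rate

for _ in range(40):
    A = np.linalg.cholesky(C)
    ys = rng.normal(size=(lam, d)) @ A.T       # search directions ~ N(0, C)
    cs = m + sigma * ys                        # candidate coordinates
    order = np.argsort([entropy(c) for c in cs])
    best_y = ys[order[:mu]]                    # select the mu fittest directions
    m = m + sigma * weights @ best_y           # recombine into the new mean
    C = (1 - c_cov) * C + c_cov * (best_y.T * weights) @ best_y

# m is the adapted coordinate; entropy(m) should now be below entropy at c = 0
```

In a real deployment the entropy call would wrap the frozen network's forward pass, and full CMA-ES would also adapt `sigma` via evolution paths; the structure of the loop (sample, rank by fitness, recombine, update covariance) is the same.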
Xinyu Luo
Department of Electrical Engineering, City University of Hong Kong
Jie Liu
Department of Electrical Engineering, City University of Hong Kong
Kecheng Chen
PhD student at EE, City University of Hong Kong
Transfer Learning · AI for Healthcare · Signal Processing
Junyi Yang
Department of Electrical Engineering, City University of Hong Kong
Bo Ding
Department of Electrical Engineering, City University of Hong Kong
Arindam Basu
Professor, City University of Hong Kong (past Associate Professor of EEE at NTU)
Neuromorphic · Analog IC · Neuromorphic Engineering · Computing-In-Memory · Brain-Machine Interface
Haoliang Li
Department of Electrical Engineering, City University of Hong Kong
AI Security · Information Forensics and Security · Machine Learning