AI Summary
This work addresses the challenge of achieving both efficiency and strong performance in backpropagation-free test-time adaptation under continual distribution shifts. Existing methods either waste computation by continuing to adapt after a domain has stabilized or are limited to updating only input prompts. To overcome these limitations, the authors propose PACE, a framework that adapts efficiently by directly optimizing the affine parameters of normalization layers within a low-dimensional subspace. Key innovations include a subspace optimization strategy combining CMA-ES with Fastfood random projections, an adaptive stopping mechanism, and a cache of domain-specific vectors. Across multiple continual distribution shift benchmarks, PACE significantly outperforms prior backpropagation-free approaches, cutting runtime by over 50% while achieving state-of-the-art accuracy.
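To illustrate the cache-of-domain-vectors idea, here is a minimal sketch of a bank that keys adapted parameter vectors by simple batch-feature statistics and reuses them when a familiar domain recurs. All names, the distance-based matching rule, and the threshold are hypothetical illustrations of the concept, not PACE's actual design:

```python
import numpy as np

class DomainVectorBank:
    """Hypothetical cache of per-domain adapted parameter vectors,
    keyed by batch feature statistics (illustrative, not PACE's exact design)."""

    def __init__(self, threshold=1.0):
        self.keys, self.vectors = [], []
        self.threshold = threshold  # max distance to treat a batch as a known domain

    def lookup(self, stats):
        # Return a cached vector if the batch statistics match a seen domain.
        if self.keys:
            dists = [np.linalg.norm(stats - k) for k in self.keys]
            i = int(np.argmin(dists))
            if dists[i] < self.threshold:
                return self.vectors[i]
        return None  # unknown domain: caller runs subspace adaptation instead

    def store(self, stats, vector):
        # Register a newly adapted vector for a freshly encountered domain.
        self.keys.append(np.asarray(stats, dtype=float).copy())
        self.vectors.append(np.asarray(vector, dtype=float).copy())

bank = DomainVectorBank(threshold=1.0)
stats_a = np.array([0.2, 1.5])                 # e.g., mean/std of batch features
bank.store(stats_a, np.array([0.9, 0.1]))      # cache the adapted vector for domain A
hit = bank.lookup(np.array([0.25, 1.4]))       # near stats_a -> reuse cached vector
miss = bank.lookup(np.array([5.0, -2.0]))      # far away -> adapt from scratch
```

A cache hit skips optimization entirely, which is where the runtime savings after domain stabilization would come from.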
Abstract
We introduce PACE, a backpropagation-free continual test-time adaptation framework that directly optimizes the affine parameters of normalization layers. Existing derivative-free approaches struggle to balance runtime efficiency with learning capacity: they either restrict updates to input prompts or adapt continuously at high cost regardless of domain stability. PACE instead applies the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) with the Fastfood projection to optimize the high-dimensional affine parameters within a low-dimensional subspace, yielding stronger adaptive performance. We further improve runtime efficiency with an adaptation stopping criterion and a domain-specialized vector bank that eliminate redundant computation. PACE achieves state-of-the-art accuracy across multiple benchmarks under continual distribution shifts while reducing runtime by over 50% compared with existing backpropagation-free methods.
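The subspace-search idea above can be sketched concretely. The toy example below uses a simple (μ, λ) evolution strategy (a simplification of the CMA-ES that PACE uses; it omits covariance adaptation) to search a low-dimensional vector z, which a fixed Gaussian random projection maps onto high-dimensional affine offsets. The Gaussian matrix stands in for the Fastfood transform, which computes this kind of projection implicitly in O(D log D); the dimensions and the surrogate loss are purely illustrative, not PACE's actual objective:

```python
import numpy as np

rng = np.random.default_rng(0)

D = 4096  # total affine parameters (gamma and beta concatenated; illustrative)
d = 16    # low-dimensional search subspace

# Stand-in for the Fastfood projection: a fixed random matrix mapping the
# d-dimensional search vector z to a D-dimensional parameter offset.
P = rng.standard_normal((D, d))

# Initial affine parameters: gamma = 1, beta = 0, as in an untouched norm layer.
theta0 = np.concatenate([np.ones(D // 2), np.zeros(D // 2)])

def loss(theta):
    # Toy surrogate for a test-time adaptation loss (e.g., prediction entropy):
    # penalize the mean gamma/beta for drifting from hypothetical target values.
    g, b = theta[:D // 2], theta[D // 2:]
    return (g.mean() - 1.1) ** 2 + (b.mean() + 0.05) ** 2

# Simple (mu, lambda) evolution strategy in the d-dimensional subspace.
mean, sigma, lam, mu = np.zeros(d), 0.5, 16, 4
for gen in range(50):
    z = mean + sigma * rng.standard_normal((lam, d))         # sample offspring
    fitness = np.array([loss(theta0 + P @ zi) for zi in z])  # evaluate in D dims
    mean = z[np.argsort(fitness)[:mu]].mean(axis=0)          # select + recombine
    sigma *= 0.97                                            # crude step-size decay

adapted = theta0 + P @ mean
print(f"loss before: {loss(theta0):.4f} -> after: {loss(adapted):.4f}")
```

Each candidate is evaluated only through forward passes of the loss, which is what makes the scheme backpropagation-free; full CMA-ES additionally adapts a covariance matrix over z for better-conditioned search.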