Feature-Based Instance Neighbor Discovery: Advanced Stable Test-Time Adaptation in Dynamic World

📅 2025-06-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Deep models suffer significant performance degradation under dynamically shifting test distributions. Method: This paper proposes the Feature-based Instance Neighbor Discovery (FIND) framework for test-time adaptation (TTA). It first uncovers the intrinsic cross-domain feature clustering structure, overcoming the limitations of global normalization. The authors design a Layer-wise Feature Disentanglement (LFD) module and a Feature Aware Batch Normalization (FABN) mechanism, and further introduce Selective FABN (S-FABN) to enable lightweight online adaptation. The framework jointly leverages online statistical estimation and graph-structure modeling. Contribution/Results: While maintaining low computational overhead, the method achieves a 30% accuracy gain over baselines on dynamic multi-distribution TTA benchmarks, substantially outperforming state-of-the-art approaches.

📝 Abstract
Despite progress, deep neural networks still suffer performance declines under distribution shifts between training and test domains, leading to a substantial decrease in Quality of Experience (QoE) for applications. Existing test-time adaptation (TTA) methods are challenged by dynamic, multiple test distributions within batches. We observe that feature distributions across different domains inherently cluster into distinct groups with varying means and variances. This divergence reveals a critical limitation of previous global normalization strategies in TTA, which inevitably distort the original data characteristics. Based on this insight, we propose Feature-based Instance Neighbor Discovery (FIND), which comprises three key components: Layer-wise Feature Disentanglement (LFD), Feature Aware Batch Normalization (FABN), and Selective FABN (S-FABN). LFD stably captures features with similar distributions at each layer by constructing graph structures, while FABN optimally combines source statistics with test-time distribution-specific statistics for robust feature representation. Finally, S-FABN determines which layers require feature partitioning and which can remain unified, thereby enhancing inference efficiency. Extensive experiments demonstrate that FIND significantly outperforms existing methods, achieving a 30% accuracy improvement in dynamic scenarios while maintaining computational efficiency.
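The abstract's core observation, that globally normalizing a batch drawn from multiple test distributions distorts each group, can be illustrated with a toy NumPy experiment (a hypothetical sketch, not the paper's actual pipeline; the two synthetic "domains" and their statistics are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy batch mixing two test "domains" with different feature statistics,
# a stand-in for the clustered feature groups FIND describes.
group_a = rng.normal(loc=0.0, scale=1.0, size=(64, 8))
group_b = rng.normal(loc=5.0, scale=0.5, size=(64, 8))
batch = np.concatenate([group_a, group_b])

# Global normalization: one mean/std for the whole mixed batch.
global_norm = (batch - batch.mean(0)) / (batch.std(0) + 1e-5)

# Group-wise normalization: statistics estimated per distribution.
def normalize(x):
    return (x - x.mean(0)) / (x.std(0) + 1e-5)

group_norm = np.concatenate([normalize(group_a), normalize(group_b)])

# Under global normalization the two groups keep a large mean offset
# (the distortion the paper points to); group-wise normalization
# centers each cluster near zero.
offset_global = abs(global_norm[:64].mean() - global_norm[64:].mean())
offset_group = abs(group_norm[:64].mean() - group_norm[64:].mean())
print(offset_global > 1.0, offset_group < 0.1)  # True True
```

The gap motivates discovering the per-distribution neighborhoods first and only then estimating normalization statistics.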
Problem

Research questions and friction points this paper is trying to address.

Address performance decline in neural networks under distribution shifts
Overcome challenges of dynamic test distributions in TTA
Enhance feature representation with layer-wise disentanglement and normalization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Layer-wise Feature Disentanglement for stable feature capture
Feature Aware Batch Normalization combines source and test statistics
Selective FABN enhances efficiency by optimizing layer processing
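The FABN idea of combining source statistics with test-time statistics can be sketched as a convex blend inside a normalization step (a minimal illustration, assuming a simple fixed mixing weight `alpha`; the actual FABN mechanism and its weighting are defined in the paper, and `feature_aware_bn` is a hypothetical helper name):

```python
import numpy as np

def feature_aware_bn(x, source_mean, source_var, alpha=0.5, eps=1e-5):
    """Normalize a test-time feature group using a blend of frozen
    source statistics and the group's own statistics. `alpha` is an
    assumed mixing weight, not the paper's learned/derived one."""
    test_mean = x.mean(axis=0)
    test_var = x.var(axis=0)
    mean = alpha * source_mean + (1 - alpha) * test_mean
    var = alpha * source_var + (1 - alpha) * test_var
    return (x - mean) / np.sqrt(var + eps)

# Usage: source stats from training, one shifted test-time feature group.
rng = np.random.default_rng(1)
source_mean, source_var = np.zeros(4), np.ones(4)
test_features = rng.normal(loc=2.0, scale=1.5, size=(128, 4))
out = feature_aware_bn(test_features, source_mean, source_var)
print(out.shape)  # (128, 4)
```

Leaning on source statistics (larger `alpha`) guards against noisy small-group estimates, while leaning on test statistics adapts to the shifted distribution; S-FABN's contribution, per the abstract, is deciding per layer whether such group-wise treatment is needed at all.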
Qinting Jiang
Tsinghua University, Shenzhen, China
Chuyang Ye
Associate Professor, School of Integrated Circuits and Electronics, Beijing Institute of Technology
Medical Image Analysis
Dongyan Wei
Shenzhen Technology University, Shenzhen, China
Bingli Wang
Sichuan Agricultural University, Sichuan, China
Yuan Xue
Tsinghua University, Shenzhen, China
Jingyan Jiang
Shenzhen Technology University, Shenzhen, China
Test-time adaptation, Embodied AI, Machine learning systems
Zhi Wang
Tsinghua University, Shenzhen, China