Beyond Parameter Finetuning: Test-Time Representation Refinement for Node Classification

📅 2026-01-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Graph neural networks suffer significant performance degradation under out-of-distribution (OOD) test scenarios, and existing test-time adaptation methods are prone to catastrophic forgetting. To address this challenge, this work proposes TTReFT, a novel test-time representation fine-tuning framework that shifts the adaptation objective from parameter adjustment to intervention in latent representations. TTReFT integrates three key innovations: uncertainty-guided node selection, low-rank representation updating, and an intervention-aware dynamic masked autoencoder. Extensive experiments across five benchmark datasets demonstrate that TTReFT consistently outperforms current methods in OOD settings, effectively balancing adaptation efficiency with model stability.
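The uncertainty-guided node selection mentioned above can be sketched as follows. This is an illustrative assumption, not the paper's exact scheme: here uncertainty is measured as the predictive entropy of each node's class distribution, and the most uncertain fraction of nodes is selected for intervention (the `ratio` parameter and entropy criterion are hypothetical choices).

```python
import numpy as np

def select_uncertain_nodes(probs, ratio=0.1):
    """Select the top-`ratio` fraction of nodes by predictive entropy.

    probs: (num_nodes, num_classes) array of softmax class probabilities.
    Returns the indices of the most uncertain nodes.
    """
    eps = 1e-12  # avoid log(0)
    entropy = -(probs * np.log(probs + eps)).sum(axis=1)
    k = max(1, int(ratio * probs.shape[0]))
    # argsort is ascending, so the last k indices are the highest-entropy nodes
    return np.argsort(entropy)[-k:]
```

With `ratio=0.1` on a 10-node graph, this picks the single node whose prediction is closest to uniform, i.e. the one the pre-trained model is least sure about.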

📝 Abstract
Graph Neural Networks frequently exhibit significant performance degradation in out-of-distribution (OOD) test scenarios. While test-time training (TTT) offers a promising solution, the existing Parameter Finetuning (PaFT) paradigm suffers from catastrophic forgetting, hindering its real-world applicability. We propose TTReFT, a novel Test-Time Representation FineTuning framework that shifts the adaptation target from model parameters to latent representations. Specifically, TTReFT achieves this through three key innovations: (1) uncertainty-guided node selection for targeted interventions, (2) low-rank representation interventions that preserve pre-trained knowledge, and (3) an intervention-aware masked autoencoder that dynamically adjusts its masking strategy to accommodate the node selection scheme. Theoretically, we establish guarantees for TTReFT in OOD settings. Empirically, extensive experiments across five benchmark datasets demonstrate that TTReFT achieves consistent and superior performance. Our work establishes representation finetuning as a new paradigm for graph TTT, offering both theoretical grounding and immediate practical utility for real-world deployment.
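The low-rank representation intervention described in the abstract can be sketched in the style of ReFT-like methods: edit a hidden representation only within a small r-dimensional subspace, leaving the component outside that subspace untouched so pre-trained knowledge is preserved. The parameterization below (orthonormal subspace `R`, learned projection `W`, bias `b`, and the chosen sizes) is an assumption for illustration, not TTReFT's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 16, 2  # hidden size and intervention rank, with r << d

# R: (r, d) with orthonormal rows, spanning the low-rank edit subspace.
R = np.linalg.qr(rng.standard_normal((d, r)))[0].T
# W, b: hypothetical learned intervention parameters (would be trained at test time).
W = rng.standard_normal((r, d)) * 0.01
b = np.zeros(r)

def intervene(h):
    """Low-rank intervention: h + R^T (W h + b - R h).

    Replaces the component of h inside the subspace spanned by R with a
    learned one; the orthogonal complement of h is provably unchanged.
    """
    return h + R.T @ (W @ h + b - R @ h)
```

Because the additive edit `R.T @ (...)` lies entirely in the row space of `R`, projecting any input and its intervened version onto the orthogonal complement of `R` gives identical results, which is the sense in which the intervention preserves the rest of the representation.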
Problem

Research questions and friction points this paper is trying to address.

out-of-distribution
graph neural networks
test-time training
catastrophic forgetting
node classification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Test-Time Training
Representation FineTuning
Graph Neural Networks
Out-of-Distribution Generalization
Low-Rank Intervention
👥 Authors
Jiaxin Zhang, National University of Defense Technology
Yiqi Wang, National University of Defense Technology
Siwei Wang, National University of Defense Technology (large-graph study; multi-view fusion; multi-view clustering)
Xihong Yang, NUDT & NUS (Graph Neural Networks; Recommender Systems; Multi-modal/view Learning)
Yu Shi, National University of Defense Technology
Xinwang Liu, NUDT; Senior Member of IEEE and CCF (Multiple Kernel Learning; Multi-view Clustering)
En Zhu, National University of Defense Technology