Purge-Gate: Backpropagation-Free Test-Time Adaptation for Point Clouds Classification via Token Purging

📅 2025-09-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address test-time performance degradation in 3D point cloud classification caused by domain shift, this paper proposes Token Purging (PG), a backpropagation-free test-time adaptation method that dynamically prunes tokens heavily affected by domain shift before they reach the ViT attention layers. PG comes in two lightweight variants: PG-SP, which leverages source-domain statistics, and PG-SF, which is guided by the CLS token; neither requires fine-tuning the source model. To the authors' knowledge, PG is the first gradient-free, token-level test-time adaptation approach for 3D vision. On ModelNet40-C, ShapeNet-C, and ScanObjectNN-C, PG-SP achieves an average accuracy gain of 10.3% over the previous best backpropagation-free methods, while PG-SF sets a new state of the art for source-free adaptation, accelerating inference by 12.4x and reducing memory consumption by 5.5x compared to existing approaches.
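The PG-SP idea described above can be sketched in a few lines: score each token by how far its features deviate from source-domain statistics, then drop the most shifted tokens before attention. This is a hypothetical illustration only; the function name, the z-score deviation criterion, and the purge ratio are assumptions, not the paper's exact formulation.

```python
import numpy as np

def purge_tokens(tokens, source_mean, source_std, purge_ratio=0.1):
    """PG-SP-style sketch (hypothetical): drop tokens whose features
    deviate most from source-domain statistics, before attention."""
    # Per-dimension z-score against the source feature distribution
    z = (tokens - source_mean) / (source_std + 1e-6)      # (N, D)
    scores = np.linalg.norm(z, axis=-1)                   # (N,) deviation per token
    n_keep = max(1, int(len(tokens) * (1.0 - purge_ratio)))
    keep = np.sort(np.argsort(scores)[:n_keep])           # keep least-shifted tokens, preserve order
    return tokens[keep]
```

Because the surviving tokens simply pass through the frozen backbone, no gradients or iterative updates are needed at test time.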

📝 Abstract
Test-time adaptation (TTA) is crucial for mitigating performance degradation caused by distribution shifts in 3D point cloud classification. In this work, we introduce Token Purging (PG), a novel backpropagation-free approach that removes tokens highly affected by domain shifts before they reach attention layers. Unlike existing TTA methods, PG operates at the token level, ensuring robust adaptation without iterative updates. We propose two variants: PG-SP, which leverages source statistics, and PG-SF, a fully source-free version relying on CLS-token-driven adaptation. Extensive evaluations on ModelNet40-C, ShapeNet-C, and ScanObjectNN-C demonstrate that PG-SP achieves an average of +10.3% higher accuracy than state-of-the-art backpropagation-free methods, while PG-SF sets new benchmarks for source-free adaptation. Moreover, PG is 12.4 times faster and 5.5 times more memory efficient than our baseline, making it suitable for real-world deployment. Code is available at https://github.com/MosyMosy/Purge-Gate
Problem

Research questions and friction points this paper is trying to address.

Mitigating performance degradation from 3D point cloud distribution shifts
Removing domain-shift affected tokens before attention layers
Enabling backpropagation-free test-time adaptation for real-world deployment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Token Purging removes domain-shifted tokens
Backpropagation-free adaptation by purging shifted tokens before attention layers
CLS-token-driven source-free variant achieves benchmark performance
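The CLS-token-driven, source-free variant (PG-SF) can likewise be sketched: without source statistics, tokens are scored by their alignment with the CLS token, and the least-aligned ones are purged. This is a hedged sketch under assumed details; the cosine-similarity criterion and function name are illustrative, not the paper's exact method.

```python
import numpy as np

def purge_by_cls(tokens, cls_token, purge_ratio=0.1):
    """PG-SF-style sketch (hypothetical): keep the tokens most aligned
    with the CLS token; no source-domain statistics required."""
    t = tokens / (np.linalg.norm(tokens, axis=-1, keepdims=True) + 1e-6)
    c = cls_token / (np.linalg.norm(cls_token) + 1e-6)
    sim = t @ c                                        # cosine similarity per token
    n_keep = max(1, int(len(tokens) * (1.0 - purge_ratio)))
    keep = np.sort(np.argsort(-sim)[:n_keep])          # drop least-aligned tokens, preserve order
    return tokens[keep]
```

Using the CLS token as the reference is what makes this variant fully source-free: the only signal comes from the model's own forward pass on the test sample.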