CoPEFT: Fast Adaptation Framework for Multi-Agent Collaborative Perception with Parameter-Efficient Fine-Tuning

๐Ÿ“… 2025-02-15
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
To address the poor generalization and high deployment costs of multi-agent collaborative perception models in novel traffic scenarios, this paper proposes a lightweight, parameter-efficient fine-tuning framework. The method introduces two components: (1) a Collaboration Adapter that aligns macro-level feature distributions across domains, and (2) an agent-specific prompt mechanism that injects micro-level environmental context. With fewer than 1% of parameters trainable, the framework enables rapid cross-scenario adaptation from only a small number of deployment samples, without requiring access to the full deployment dataset. This significantly reduces training overhead and suits resource-constrained agents. Extensive experiments on multiple collaborative perception benchmarks demonstrate that the approach substantially outperforms existing domain adaptation methods while keeping inference latency unchanged and converging more than three times faster.

๐Ÿ“ Abstract
Multi-agent collaborative perception is expected to significantly improve perception performance by overcoming the limitations of single-agent perception through the exchange of complementary information. However, training a robust collaborative perception model requires collecting enough training data to cover all possible collaboration scenarios, which is impractical due to intolerable deployment costs. Hence, the trained model is not robust against new traffic scenarios with inconsistent data distributions, which fundamentally restricts its real-world applicability. Moreover, existing methods such as domain adaptation mitigate this issue by exposing deployment data during the training stage, but they incur a high training cost that is infeasible for resource-constrained agents. In this paper, we propose a Parameter-Efficient Fine-Tuning-based lightweight framework, CoPEFT, for fast adaptation of a trained collaborative perception model to new deployment environments under low-cost conditions. CoPEFT develops a Collaboration Adapter and an Agent Prompt to perform macro-level and micro-level adaptation, respectively. Specifically, the Collaboration Adapter utilizes the inherent knowledge from the training data together with limited deployment data to adapt the feature map to the new data distribution. The Agent Prompt further enhances the Collaboration Adapter by injecting fine-grained contextual information about the environment. Extensive experiments demonstrate that CoPEFT surpasses existing methods with less than 1% trainable parameters, confirming the effectiveness and efficiency of the proposed method.
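The abstract's core idea is a parameter-efficient adapter attached to a frozen perception backbone: only a small residual bottleneck is trained to shift feature maps toward the new domain. The paper's exact architecture is not reproduced here; the following is a minimal NumPy sketch of a generic residual bottleneck adapter, where `d_model`, `d_bottleneck`, and the backbone parameter count are illustrative assumptions, not CoPEFT's actual values.

```python
import numpy as np

rng = np.random.default_rng(0)

def bottleneck_adapter(x, W_down, W_up):
    """Residual bottleneck adapter: x + up_proj(relu(down_proj(x)))."""
    h = np.maximum(x @ W_down, 0.0)  # down-project, then ReLU
    return x + h @ W_up              # up-project and add back (residual)

d_model, d_bottleneck = 256, 8                     # hypothetical sizes
W_down = rng.normal(0.0, 0.02, (d_model, d_bottleneck))
W_up = np.zeros((d_bottleneck, d_model))           # zero-init: adapter starts as identity

x = rng.normal(size=(4, d_model))                  # a batch of fused feature vectors
y = bottleneck_adapter(x, W_down, W_up)            # equals x before any training

adapter_params = W_down.size + W_up.size           # 2 * 256 * 8 = 4096
backbone_params = 12 * d_model * d_model * 4       # rough stand-in for a frozen backbone
ratio = adapter_params / (adapter_params + backbone_params)
print(ratio)                                       # well under 1% trainable
```

Zero-initializing the up-projection is a common adapter trick: the adapted model initially behaves exactly like the pretrained one, so fine-tuning on a few deployment samples starts from a stable point.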
Problem

Research questions and friction points this paper is trying to address.

Enhance multi-agent collaborative perception efficiency
Reduce training data collection costs
Adapt models to new traffic scenarios
Innovation

Methods, ideas, or system contributions that make the work stand out.

Parameter-Efficient Fine-Tuning
Collaboration Adapter
Agent Prompt enhancement
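The Agent Prompt contribution amounts to a small set of learnable per-agent tokens injected alongside each agent's features. Where CoPEFT inserts them is not specified here; this sketch assumes the common prompt-tuning convention of prepending the tokens to a flattened feature sequence, with all sizes chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, n_tokens, n_prompt = 256, 100, 4   # hypothetical sizes

# One learnable prompt per agent, carrying agent-specific context.
agent_prompt = rng.normal(0.0, 0.02, (n_prompt, d_model))

feats = rng.normal(size=(n_tokens, d_model))         # one agent's flattened features
prompted = np.concatenate([agent_prompt, feats], 0)  # prepend the prompt tokens
print(prompted.shape)                                # (104, 256)
```

Only `agent_prompt` would receive gradients during adaptation; the feature extractor producing `feats` stays frozen, consistent with the under-1%-trainable-parameters setting.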
๐Ÿ”Ž Similar Papers
No similar papers found.
Q
Quanmin Wei
School of Computing and Artificial Intelligence, Southwest Jiaotong University; Engineering Research Center of Sustainable Urban Intelligent Transportation, Ministry of Education
Penglin Dai
Southwest Jiaotong University
Edge Intelligence, Autonomous Driving, Internet of Vehicles
Wei Li
School of Computing and Artificial Intelligence, Southwest Jiaotong University; Engineering Research Center of Sustainable Urban Intelligent Transportation, Ministry of Education
Bingyi Liu
Professor, Department of CS and AI, Wuhan University of Technology
Internet of Vehicles, Edge Computing, Autonomous Vehicles, Intelligent Transportation Systems
Xiao Wu
School of Computing and Artificial Intelligence, Southwest Jiaotong University; Engineering Research Center of Sustainable Urban Intelligent Transportation, Ministry of Education