Weather-Conditioned Branch Routing for Robust LiDAR-Radar 3D Object Detection

📅 2026-04-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of degraded sensor reliability under adverse weather conditions and the limited adaptability of existing LiDAR–4D radar fusion methods. To overcome this, the authors propose a weather-condition-driven dynamic branch routing mechanism that maintains three parallel feature streams—LiDAR-only, 4D radar-only, and conditionally gated fusion—and employs a lightweight router to softly aggregate them using visual and semantic cues. Weather-supervised learning combined with diversity regularization is introduced to prevent branch collapse and enable interpretable, context-aware modality preference switching. Evaluated on the K-Radar benchmark, the method significantly enhances the robustness of 3D object detection in harsh weather, achieving state-of-the-art performance while exhibiting clear and explainable modality selection behavior.
📝 Abstract
Robust 3D object detection in adverse weather is highly challenging due to the varying reliability of different sensors. While existing LiDAR-4D radar fusion methods improve robustness, they predominantly rely on fixed or weakly adaptive pipelines, failing to dynamically adjust modality preferences as environmental conditions change. To bridge this gap, we reformulate multi-modal perception as a weather-conditioned branch routing problem. Instead of computing a single fused output, our framework explicitly maintains three parallel 3D feature streams: a pure LiDAR branch, a pure 4D radar branch, and a condition-gated fusion branch. Guided by a condition token extracted from visual and semantic prompts, a lightweight router dynamically predicts sample-specific weights to softly aggregate these representations. Furthermore, to prevent branch collapse, we introduce a weather-supervised learning strategy with auxiliary classification and diversity regularization to enforce distinct, condition-dependent routing behaviors. Extensive experiments on the K-Radar benchmark demonstrate that our method achieves state-of-the-art performance. Moreover, it provides explicit and highly interpretable insights into modality preferences, transparently revealing how adaptive routing robustly shifts reliance between LiDAR and 4D radar across diverse adverse-weather scenarios. The source code will be released.
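The soft branch aggregation described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration assuming a linear router over the condition token followed by a softmax over branch logits; the function and parameter names (`route`, `router_weights`) are illustrative, not the authors' implementation, which likely operates on full BEV feature maps rather than flat vectors.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of floats.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def route(branch_feats, condition_token, router_weights):
    """Softly aggregate parallel branch features with condition-driven gates.

    branch_feats:     list of 3 feature vectors (LiDAR, 4D radar, fusion).
    condition_token:  vector summarizing weather/visual/semantic cues.
    router_weights:   3 x len(condition_token) matrix (a linear router).
    """
    # One logit per branch: linear projection of the condition token.
    logits = [sum(w * c for w, c in zip(row, condition_token))
              for row in router_weights]
    gates = softmax(logits)  # sample-specific, sums to 1
    dim = len(branch_feats[0])
    # Convex combination of the three branch representations.
    fused = [sum(gates[b] * branch_feats[b][i] for b in range(3))
             for i in range(dim)]
    return fused, gates
```

Because the gates form a convex combination, the router can smoothly shift reliance from LiDAR toward 4D radar (or the fusion branch) as the condition token changes, which is what makes the modality preference directly readable from the gate values.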
Problem

Research questions and friction points this paper is trying to address.

adverse weather
LiDAR-radar fusion
3D object detection
modality reliability
dynamic routing
Innovation

Methods, ideas, or system contributions that make the work stand out.

weather-conditioned routing
LiDAR-radar fusion
adaptive multimodal perception
branch diversity regularization
3D object detection
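One plausible form of the branch diversity regularization listed above is an entropy-style anti-collapse penalty on the batch-mean gate distribution, as commonly used for load balancing in mixture-of-experts routing. This is a sketch under that assumption; the paper's exact loss may differ.

```python
import math

def mean_gate_entropy_loss(gate_batch):
    """Anti-collapse penalty on routing gates.

    gate_batch: list of per-sample gate vectors, each summing to 1.
    Returns log(K) minus the entropy of the batch-mean gate, so the
    loss is 0 when, on average, all K branches are used equally and
    grows as routing collapses onto a single branch.
    """
    n = len(gate_batch)
    k = len(gate_batch[0])
    mean_gate = [sum(g[j] for g in gate_batch) / n for j in range(k)]
    entropy = -sum(p * math.log(p + 1e-12) for p in mean_gate)
    return math.log(k) - entropy
```

Minimizing this term alongside the weather-supervised auxiliary classification keeps all three branches trained and condition-dependent instead of letting the router degenerate to one branch.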