Super4DR: 4D Radar-centric Self-supervised Odometry and Gaussian-based Map Optimization

📅 2025-12-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the performance degradation of visual/LiDAR SLAM in adverse conditions, this paper proposes the first 4D radar-centric, self-supervised framework that jointly optimizes odometry and Gaussian mapping. To tackle the challenges of sparse, noisy radar point clouds and incomplete, ambiguous map structures, we introduce a cluster-aware radar odometry network and a hierarchical self-supervision mechanism. Crucially, we pioneer the use of 3D Gaussians as an intermediate representation for radar maps, integrated with a radar-specific Gaussian growth strategy and multi-view geometric regularization. Experiments demonstrate that our odometry achieves a 67% accuracy improvement over prior self-supervised methods, approaching supervised performance, while significantly enhancing map structural completeness and geometric fidelity. This substantially narrows the quality gap between radar- and LiDAR-based mapping and enables high-fidelity multi-modal image rendering.

📝 Abstract
Conventional SLAM systems using visual or LiDAR data often struggle in poor lighting and severe weather. Although 4D radar is well suited to such environments, its sparse and noisy point clouds hinder accurate odometry estimation, and the resulting radar maps suffer from obscure, incomplete structures. We therefore propose Super4DR, a 4D radar-centric framework for learning-based odometry estimation and Gaussian-based map optimization. First, we design a cluster-aware odometry network that incorporates object-level cues from clustered radar points for inter-frame matching, alongside a hierarchical self-supervision mechanism that suppresses outliers through spatio-temporal consistency, knowledge transfer, and feature contrast. Second, we propose using 3D Gaussians as an intermediate map representation, coupled with a radar-specific growth strategy, selective separation, and multi-view regularization, to recover blurry map areas and regions left undetected, guided by image texture. Experiments show that Super4DR achieves a 67% performance gain over prior self-supervised methods, nearly matches supervised odometry, and narrows the map-quality gap with LiDAR while enabling multi-modal image rendering.
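The cluster-aware odometry network draws object-level cues from clustered radar points. The paper does not specify its clustering algorithm; as a rough illustration only, a DBSCAN-style grouping of a sparse 3D radar point cloud (the `eps` and `min_pts` parameters here are hypothetical) might look like:

```python
import numpy as np

def cluster_radar_points(points, eps=1.0, min_pts=3):
    """DBSCAN-like BFS clustering of sparse 3D radar points.

    points: (N, 3) array. Returns an (N,) integer label array;
    -1 marks noise (groups smaller than min_pts).
    """
    n = len(points)
    labels = np.full(n, -1)
    visited = np.zeros(n, dtype=bool)
    cluster_id = 0
    for i in range(n):
        if visited[i]:
            continue
        # Flood-fill the eps-neighborhood graph starting from point i.
        queue = [i]
        visited[i] = True
        members = []
        while queue:
            j = queue.pop()
            members.append(j)
            dist = np.linalg.norm(points - points[j], axis=1)
            for k in np.where((dist < eps) & ~visited)[0]:
                visited[k] = True
                queue.append(int(k))
        if len(members) >= min_pts:
            labels[np.array(members)] = cluster_id
            cluster_id += 1
    return labels
```

Each resulting cluster can then serve as an object-level unit for inter-frame matching, rather than matching individual noisy points.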
Problem

Research questions and friction points this paper is trying to address.

Improves 4D radar odometry in adverse conditions
Enhances map reconstruction from sparse radar data
Enables multi-modal rendering with radar and images
Innovation

Methods, ideas, or system contributions that make the work stand out.

Cluster-aware odometry network with object-level cues
Hierarchical self-supervision for spatio-temporal consistency
3D Gaussian representation with radar-specific growth strategy
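The intermediate map representation models each radar point region as an anisotropic 3D Gaussian. A minimal sketch of the standard 3D Gaussian parameterization (mean plus a covariance factored into per-axis scales and a rotation, as in Gaussian splatting; not the paper's exact implementation):

```python
import numpy as np

def quat_to_rot(q):
    """Rotation matrix from a quaternion (w, x, y, z); normalizes q first."""
    w, x, y, z = np.asarray(q, float) / np.linalg.norm(q)
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])

def gaussian_covariance(scales, quat):
    """Sigma = R S S^T R^T: anisotropic covariance of one 3D Gaussian,
    built from per-axis scales and an orientation quaternion."""
    R = quat_to_rot(quat)
    S = np.diag(scales)
    return R @ S @ S.T @ R.T
```

A growth strategy can then densify the map by spawning new Gaussians in sparse or blurry regions, while multi-view regularization constrains their shapes against image observations.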