SPICE-HL3: Single-Photon, Inertial, and Stereo Camera dataset for Exploration of High-Latitude Lunar Landscapes

📅 2025-06-28
🤖 AI Summary
Lunar high-latitude regions present extreme visual challenges—including high dynamic range, elongated shadows, and near-total darkness—due to low solar elevation angles, severely degrading conventional vision-based perception. Method: This work introduces the first multimodal robotic perception dataset specifically designed for complex lunar illumination conditions. It integrates a single-photon avalanche diode (SPAD) camera for high-sensitivity, high-speed imaging under ultra-low-light conditions, synchronized with stereo RGB cameras, an inertial measurement unit (IMU), and wheel encoders. Data were collected across diverse trajectories and illumination regimes—from dawn to night—under controlled rover motion. Contribution/Results: The dataset comprises 88 sequences and 1.3 million precisely timestamp-aligned images, captured at rover speeds of 5–50 cm/s. It is the first publicly available benchmark addressing high-latitude lunar surface perception under degraded visual conditions, enabling rigorous evaluation of autonomous navigation and scientific imaging algorithms in extreme illumination scenarios.

📝 Abstract
Exploring high-latitude lunar regions presents an extremely challenging visual environment for robots. The low sunlight elevation angle and minimal light scattering result in a visual field dominated by a high dynamic range featuring long, dynamic shadows. Reproducing these conditions on Earth requires sophisticated simulators and specialized facilities. We introduce a unique dataset recorded at the LunaLab from the SnT - University of Luxembourg, an indoor test facility designed to replicate the optical characteristics of multiple lunar latitudes. Our dataset includes images, inertial measurements, and wheel odometry data from robots navigating seven distinct trajectories under multiple illumination scenarios, simulating high-latitude lunar conditions from dawn to nighttime with and without the aid of headlights, resulting in 88 distinct sequences containing a total of 1.3M images. Data were captured using a stereo RGB-inertial sensor, a monocular monochrome camera, and for the first time, a novel single-photon avalanche diode (SPAD) camera. We recorded both static and dynamic image sequences, with robots navigating at slow (5 cm/s) and fast (50 cm/s) speeds. All data is calibrated, synchronized, and timestamped, providing a valuable resource for validating perception tasks from vision-based autonomous navigation to scientific imaging for future lunar missions targeting high-latitude regions or those intended for robots operating across perceptually degraded environments. The dataset can be downloaded from https://zenodo.org/records/13970078?preview=1, and a visual overview is available at https://youtu.be/d7sPeO50_2I. All supplementary material can be found at https://github.com/spaceuma/spice-hl3.
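The abstract emphasizes that all sensor streams are calibrated, synchronized, and timestamped. A common way to consume such a dataset is nearest-timestamp association between two image streams. The sketch below is a generic illustration under assumed conditions (the tolerance value and the example frame rates are hypothetical, not taken from the dataset's specification):

```python
import bisect

def match_nearest(ts_a, ts_b, tol_s=0.02):
    """Pair each timestamp in ts_a with the nearest timestamp in ts_b.

    ts_a, ts_b: sorted lists of timestamps in seconds.
    tol_s: maximum allowed offset (a hypothetical 20 ms tolerance).
    Returns a list of (i, j) index pairs into ts_a and ts_b.
    """
    pairs = []
    for i, t in enumerate(ts_a):
        k = bisect.bisect_left(ts_b, t)
        # Candidates: the neighbors just before and after the insertion point.
        best = None
        for j in (k - 1, k):
            if 0 <= j < len(ts_b):
                if best is None or abs(ts_b[j] - t) < abs(ts_b[best] - t):
                    best = j
        if best is not None and abs(ts_b[best] - t) <= tol_s:
            pairs.append((i, best))
    return pairs

# Example: a 10 Hz stream matched against a slightly offset 5 Hz stream.
spad_ts = [0.00, 0.10, 0.20, 0.30, 0.40]
rgb_ts = [0.01, 0.21, 0.41]
print(match_nearest(spad_ts, rgb_ts))  # [(0, 0), (2, 1), (4, 2)]
```

Frames without a partner inside the tolerance are simply dropped, which keeps cross-modal pairs honest when the two cameras run at different rates.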
Problem

Research questions and friction points this paper is trying to address.

Extreme visual challenges for robots in high-latitude lunar regions
Replicating extreme lunar lighting conditions on Earth
Validating perception tasks for lunar mission navigation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Stereo RGB-inertial sensor for data capture
Novel SPAD camera for low-light imaging
Simulation of multiple illumination scenarios
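SPAD cameras typically read out binary photon-detection frames at very high rates; a low-light intensity image is then formed by accumulating many such frames. The sketch below illustrates that general principle only — it is not the paper's processing pipeline, and the per-pixel photon rates are made up for the demonstration:

```python
import random

def average_binary_frames(frames):
    """Average a stack of binary SPAD frames into a photon-rate image.

    frames: list of equal-size 2D lists of 0/1 photon detections.
    Returns a 2D list of per-pixel detection rates in [0, 1].
    """
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    acc = [[0.0] * w for _ in range(h)]
    for f in frames:
        for r in range(h):
            for c in range(w):
                acc[r][c] += f[r][c]
    return [[v / n for v in row] for row in acc]

# Simulate 1000 binary frames of a 2x2 scene with hypothetical photon rates.
random.seed(0)
rates = [[0.05, 0.20], [0.50, 0.90]]
frames = [[[1 if random.random() < rates[r][c] else 0 for c in range(2)]
           for r in range(2)] for _ in range(1000)]
img = average_binary_frames(frames)
# Each averaged pixel converges toward its underlying photon rate.
```

Averaging more frames reduces shot noise but lengthens the effective exposure, which is the basic trade-off a high-speed SPAD sensor lets a rover tune per scene.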
David Rodríguez-Martínez
Space Robotics Lab, Department of Systems Engineering and Automation, University of Malaga, Spain.
Dave van der Meer
Space Robotics Research Group, Interdisciplinary Research Center for Security, Reliability, and Trust (SnT), University of Luxembourg, Luxembourg.
Junlin Song
University of Luxembourg
State Estimation · SLAM · Calibration
Abishek Bera
Space Robotics Research Group, Interdisciplinary Research Center for Security, Reliability, and Trust (SnT), University of Luxembourg, Luxembourg.
C. J. Pérez-del-Pulgar
Space Robotics Lab, Department of Systems Engineering and Automation, University of Malaga, Spain.
Miguel Angel Olivares-Mendez
Space Robotics Research Group, Interdisciplinary Research Center for Security, Reliability, and Trust (SnT), University of Luxembourg, Luxembourg.