🤖 AI Summary
Spacecraft attitude estimation under extreme illumination conditions remains challenging—RGB sensors suffer from glare-induced artifacts, while event cameras exhibit low spatial resolution and poor signal-to-noise ratio during slow motion. Method: We propose a tightly coupled RGB-event heterogeneous sensor fusion framework: optical and temporal alignment is achieved via a beam-splitter prism; a RANSAC-guided dynamic-weight feature fusion strategy is designed, augmented by a Dropout-based uncertainty quantification mechanism to enhance robustness. Contribution/Results: Evaluated on a novel, self-collected multi-illumination real-world dataset, our method significantly improves attitude estimation accuracy and stability—reducing failure rate by 62% under abrupt strong/low-light transitions. The end-to-end pipeline and the publicly released dataset establish a new benchmark for future research in robust spacecraft perception.
📝 Abstract
Spacecraft pose estimation is crucial for autonomous in-space operations, such as rendezvous, docking and on-orbit servicing. Vision-based pose estimation methods, which typically employ RGB imaging sensors, is a compelling solution for spacecraft pose estimation, but are challenged by harsh lighting conditions, which produce imaging artifacts such as glare, over-exposure, blooming and lens flare. Due to their much higher dynamic range, neuromorphic or event sensors are more resilient to extreme lighting conditions. However, event sensors generally have lower spatial resolution and suffer from reduced signal-to-noise ratio during periods of low relative motion. This work addresses these individual sensor limitations by introducing a sensor fusion approach combining RGB and event sensors. A beam-splitter prism was employed to achieve precise optical and temporal alignment. Then, a RANSAC-based technique was developed to fuse the information from the RGB and event channels to achieve pose estimation that leveraged the strengths of the two modalities. The pipeline was complemented by dropout uncertainty estimation to detect extreme conditions that affect either channel. To benchmark the performance of the proposed event-RGB fusion method, we collected a comprehensive real dataset of RGB and event data for satellite pose estimation in a laboratory setting under a variety of challenging illumination conditions. Encouraging results on the dataset demonstrate the efficacy of our event-RGB fusion approach and further supports the usage of event sensors for spacecraft pose estimation. To support community research on this topic, our dataset will be released publicly.