Quantifying Accuracy of an Event-Based Star Tracker via Earth's Rotation

📅 2025-09-10
🤖 AI Summary
Quantitative accuracy evaluation of event-camera-based star trackers remains challenging in the absence of ground-truth attitude references. To address this, this work introduces Earth's rotation as a high-precision physical reference, using it as the ground-truth source for attitude estimation with a static ground-based event camera. By capturing sparse nocturnal event streams and comparing real-time attitude solutions against International Earth Rotation and Reference Systems Service (IERS)-published Earth rotation parameters, the method enables calibration-free, in-situ quantification of systematic errors. Experimental results demonstrate a root-mean-square across-boresight error of 18.47″ and an about-boresight error of 78.84″, validating the feasibility and reliability of event cameras for low-cost, low-latency star tracking. This work establishes a reproducible, physically interpretable performance-evaluation paradigm for neuromorphic-vision-based astronomical navigation systems.

📝 Abstract
Event-based cameras (EBCs) are a promising new technology for star-tracking-based attitude determination, but prior studies have struggled to determine accurate ground truth for real data. We analyze the accuracy of an EBC star tracking system utilizing the Earth's motion as the ground truth for comparison. The Earth rotates in a regular way with very small irregularities, which are measured to the level of milli-arcseconds. By keeping an event camera static and pointing it through a ground-based telescope at the night sky, we create a system where the only camera motion in the celestial reference frame is that induced by the Earth's rotation. The resulting event stream is processed to generate estimates of orientation, which we compare to the International Earth Rotation and Reference Systems Service (IERS) measured orientation of the Earth. The event camera system is able to achieve a root-mean-squared across-boresight error of 18.47 arcseconds and an about-boresight error of 78.84 arcseconds. Combined with the other benefits of event cameras over framing sensors (reduced computation due to sparser data streams, higher dynamic range, lower energy consumption, faster update rates), this level of accuracy suggests the utility of event cameras for low-cost and low-latency star tracking. We provide all code and data used to generate our results: https://gitlab.kitware.com/nest-public/telescope_accuracy_quantification.
Problem

Research questions and friction points this paper is trying to address.

Quantifying accuracy of event-based star tracker using Earth's rotation
Establishing ground truth for event camera attitude determination systems
Evaluating orientation estimation against IERS Earth rotation measurements
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Earth's rotation as ground truth reference
Processes event stream for orientation estimation
Compares results with IERS measured Earth orientation
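The IERS reference orientation mentioned above is anchored by the Earth Rotation Angle, a published formula relating UT1 time to Earth's rotation about its axis. A minimal sketch of that formula (IERS Conventions 2010, eq. 5.15) follows; applying IERS-measured UT1 corrections and polar motion on top of it, as the paper's comparison would require, is omitted here:

```python
import math

def earth_rotation_angle(jd_ut1):
    """Earth Rotation Angle in radians from a UT1 Julian date.

    Linear-in-UT1 formula from the IERS Conventions (2010):
    ERA = 2*pi * (0.7790572732640 + 1.00273781191135448 * Tu),
    where Tu is UT1 days since J2000.0.
    """
    tu = jd_ut1 - 2451545.0  # days since J2000.0 (UT1)
    frac = 0.7790572732640 + 1.00273781191135448 * tu
    return 2.0 * math.pi * (frac % 1.0)

# ERA at the J2000.0 epoch itself.
print(earth_rotation_angle(2451545.0))
```

One sidereal day later (1/1.00273781191135448 UT1 days), the angle returns to the same value, which is the regularity the paper exploits as ground truth.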