🤖 AI Summary
To address the need for safe physical human–robot interaction, this paper investigates real-time scene flow estimation from low-density, high-noise point clouds acquired by distributed miniature time-of-flight (ToF) sensors mounted on a robot’s body. We propose a dense motion estimation framework integrating clustering with an enhanced iterative closest point (ICP) algorithm. Our approach introduces a novel fitness-driven mechanism for classifying points as static or dynamic, coupled with an adaptive inlier rejection strategy to mitigate noise and sparsity effects. Furthermore, we incorporate match-quality-based dynamic thresholding and robust geometric correspondence optimization. Evaluated on a 24-sensor experimental platform, the method achieves motion direction and velocity estimation errors comparable to the intrinsic noise level of the ToF sensors. The framework demonstrates strong practicality and robustness under realistic operating conditions.
📝 Abstract
Tracking the motion of humans and objects in a robot's surroundings is essential for safe robot motions and reactions. In this work, we present an approach for scene flow estimation from low-density and noisy point clouds acquired by miniaturized Time-of-Flight (ToF) sensors distributed on the robot body. The proposed method clusters points from consecutive frames and applies Iterative Closest Point (ICP) to estimate a dense motion flow, with additional steps introduced to mitigate the impact of sensor noise and low point density. Specifically, we employ a fitness-based classification to distinguish between stationary and moving points, and an inlier removal strategy to refine geometric correspondences. The proposed approach is validated in an experimental setup where 24 ToF sensors are used to estimate the velocity of an object moving at different controlled speeds. Experimental results show that the method consistently approximates the direction of the motion and its magnitude with an error that is in line with sensor noise.
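The cluster-wise pipeline described above (ICP alignment between consecutive frames, a fitness score for static/dynamic classification, and distance-based inlier rejection) can be sketched in a few lines of NumPy. This is a minimal illustration of the general idea, not the paper's implementation: the function names, thresholds (`inlier_dist`, `noise_floor`, `min_fitness`), and brute-force nearest-neighbour search are all assumptions made for clarity.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation + translation aligning src onto dst (Kabsch/SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=20, inlier_dist=0.2):
    """Point-to-point ICP with distance-based inlier rejection and a fitness score."""
    cur, R_tot, t_tot = src.copy(), np.eye(3), np.zeros(3)
    for _ in range(iters):
        d = np.linalg.norm(cur[:, None] - dst[None], axis=2)  # brute-force NN
        idx = d.argmin(axis=1)
        dists = d[np.arange(len(cur)), idx]
        inliers = dists < inlier_dist  # reject dubious correspondences
        if inliers.sum() < 3:
            break
        R, t = best_rigid_transform(cur[inliers], dst[idx[inliers]])
        cur = cur @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
    # fitness: fraction of source points with a close match after alignment
    final = np.linalg.norm(cur[:, None] - dst[None], axis=2).min(axis=1)
    return R_tot, t_tot, float((final < inlier_dist).mean())

def classify(fitness, t, dt, noise_floor=0.01, min_fitness=0.5):
    """Label one cluster static/dynamic from ICP fitness and displacement norm."""
    speed = np.linalg.norm(t) / dt
    if fitness < min_fitness:
        return "unreliable", speed
    return ("dynamic" if np.linalg.norm(t) > noise_floor else "static"), speed

# Toy example: one cluster translated by ~10 cm between two frames 0.1 s apart.
rng = np.random.default_rng(0)
src = rng.random((30, 3))
t_true = np.array([0.10, 0.02, 0.0])
R_est, t_est, fit = icp(src, src + t_true)
label, speed = classify(fit, t_est, dt=0.1)
```

A low fitness score flags clusters where correspondences are unreliable (e.g. due to sensor noise or sparsity), which is the role the fitness-based classification plays in the paper's framework.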