🤖 AI Summary
Horizontally mounted LiDARs (e.g., the MID360) suffer from sparse near-ground point clouds, weak terrain perception, and insufficient features, degrading odometry performance in unstructured environments. Method: This paper proposes a perception-enhanced motion control framework requiring no additional hardware. Leveraging the coupling between the spherical robot's internal differential drive and the LiDAR's attitude, it passively enhances vertical scan diversity by superimposing bounded, aperiodic oscillatory excitation signals onto trajectory tracking commands. The method integrates tightly coupled LiDAR–IMU odometry, differential-drive kinematic modeling, and perception-aware control. Results: Experiments demonstrate 96% map completeness, a 27% reduction in trajectory tracking error, and significantly improved robustness in near-ground human detection, while preserving low power consumption, light weight, and cost efficiency.
📝 Abstract
Autonomous mobile robots increasingly rely on LiDAR–IMU odometry for navigation and mapping, yet horizontally mounted LiDARs such as the MID360 capture few near-ground returns, limiting terrain awareness and degrading performance in feature-scarce environments. Prior solutions (static tilt, active rotation, or high-density sensors) either sacrifice horizontal perception or incur additional actuators, cost, and power. We introduce PERAL, a perception-aware motion control framework for spherical robots that achieves passive LiDAR excitation without dedicated hardware. By modeling the coupling between internal differential-drive actuation and sensor attitude, PERAL superimposes bounded, non-periodic oscillations onto nominal goal- or trajectory-tracking commands, enriching vertical scan diversity while preserving navigation accuracy. Implemented on a compact spherical robot, PERAL is validated across laboratory, corridor, and tactical environments. Experiments demonstrate up to 96% map completeness, a 27% reduction in trajectory tracking error, and robust near-ground human detection, all at lower weight, power, and cost than static tilt, active rotation, and fixed horizontal baselines. The design and code will be open-sourced upon acceptance.
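The core idea of superimposing a bounded, non-periodic excitation onto a nominal command can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the excitation here is a weighted sum of sinusoids at incommensurate frequencies (a common way to get a bounded aperiodic signal), and all amplitudes, frequencies, and function names are hypothetical.

```python
import math

def excitation(t, amp=0.15, freqs=(1.0, math.sqrt(2.0)), weights=(0.6, 0.4)):
    """Bounded, aperiodic oscillation (hypothetical parameters).

    Summing sinusoids whose frequency ratio is irrational yields a signal
    that never repeats; because the weights sum to 1, the output magnitude
    is bounded by `amp`.
    """
    s = sum(w * math.sin(2.0 * math.pi * f * t) for f, w in zip(freqs, weights))
    return amp * s

def excited_command(v_nominal, omega_nominal, t):
    """Superimpose the excitation onto the nominal yaw-rate command.

    Forward velocity is passed through unchanged, so the nominal
    trajectory-tracking behavior is perturbed only in rotation, which
    (via the drive-attitude coupling) tilts the LiDAR's scan plane.
    """
    return v_nominal, omega_nominal + excitation(t)
```

The bound on the excitation amplitude is what lets the nominal tracking command dominate: the perturbation enriches vertical scan diversity without driving the robot off its planned path.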