🤖 AI Summary
To address the limited intuitiveness of conventional remote-control operation for unmanned aerial vehicles (UAVs), this paper proposes a natural gesture interaction system based on a handheld motion controller. The system integrates hand-pose estimation, six-axis inertial measurement unit (IMU) sensing, and the ExpressLRS long-range wireless protocol to enable low-latency, robust mapping from multi-degree-of-freedom gestures—including thumb-index pinch/open, palm orientation, and tilt—to UAV control inputs (throttle, pitch/roll, and yaw). Notably, the same interface works in both real-world and simulated environments. A user study (N=32) employing the UEQ-S questionnaire yielded high scores for pragmatic quality (2.2) and hedonic quality (2.3), indicating strong usability and user engagement. This work advances intuitive UAV control for applications in drone racing, scientific research, and operator training.
📝 Abstract
We present an intuitive human-drone interaction system that uses a gesture-based motion controller to enhance the drone operation experience in real and simulated environments. The handheld motion controller enables natural control of the drone through the movements of the operator's hand, thumb, and index finger: the trigger press manages the throttle, the tilt of the hand adjusts pitch and roll, and the thumbstick controls yaw rotation. Communication with drones is facilitated via the ExpressLRS radio protocol, ensuring robust connectivity across various frequencies. A user evaluation of the flight experience with the designed controller, conducted with the UEQ-S survey, showed high scores for both Pragmatic (mean = 2.2, SD = 0.8) and Hedonic (mean = 2.3, SD = 0.9) Qualities. This versatile control interface supports applications such as research, drone racing, and training programs in real and simulated environments, thereby contributing to advances in the field of human-drone interaction.
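The gesture-to-control mapping described above (trigger → throttle, hand tilt → pitch/roll, thumbstick → yaw) can be sketched as a small normalization routine. This is a hypothetical illustration, not the authors' implementation: the function names, the ±45° tilt range, and the standard 1000–2000 µs RC pulse encoding are assumptions.

```python
# Hypothetical sketch of the gesture-to-channel mapping described in the
# abstract. Ranges and names are assumptions, not the paper's actual code.

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def to_us(norm):
    """Map a normalized value in [-1, 1] to a standard 1000-2000 us RC pulse."""
    return int(round(1500 + 500 * clamp(norm, -1.0, 1.0)))

def gestures_to_channels(trigger, tilt_pitch_deg, tilt_roll_deg,
                         thumbstick_x, max_tilt_deg=45.0):
    """trigger in [0, 1]; tilts in degrees (from the IMU); thumbstick_x in [-1, 1]."""
    throttle = to_us(2.0 * clamp(trigger, 0.0, 1.0) - 1.0)  # 0..1 -> 1000..2000 us
    pitch = to_us(tilt_pitch_deg / max_tilt_deg)            # forward/back hand tilt
    roll = to_us(tilt_roll_deg / max_tilt_deg)              # left/right hand tilt
    yaw = to_us(thumbstick_x)                               # thumbstick deflection
    return {"throttle": throttle, "pitch": pitch, "roll": roll, "yaw": yaw}

# Half trigger, level pitch, half-right roll, full-left thumbstick:
print(gestures_to_channels(0.5, 0.0, 22.5, -1.0))
# -> {'throttle': 1500, 'pitch': 1500, 'roll': 1750, 'yaw': 1000}
```

In a real ExpressLRS link these channel values would be packed into CRSF frames by the transmitter firmware; the sketch stops at the channel-value level.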