🤖 AI Summary
This work addresses the lack of mid-air multi-point haptic feedback in immersive virtual reality (VR) and teleoperation. We propose a drone-based dynamic multi-contact haptic interface comprising six lightweight five-bar linkage mechanisms mounted within a protective cage, integrated with Vicon optical motion capture for high-precision pose estimation and stable hovering. By jointly tuning vibration intensity modulation and flight stability control, the system renders predefined static haptic patterns in mid-air in real time. To our knowledge, this is the first demonstration of programmable, dynamic, multi-contact haptic feedback on an aerial platform, establishing a new paradigm for wearable-free mid-air haptic interaction. In a user study, participants recognized the haptic patterns with an average accuracy of 86.5%, with no statistically significant difference across patterns. Flight demonstrations further validated force output consistency and hover stability under realistic operating conditions.
📝 Abstract
This work presents FlyHaptics, an aerial haptic interface tracked via a Vicon optical motion capture system and built around six five-bar linkage assemblies enclosed in a lightweight protective cage. We predefined five static tactile patterns - each characterized by a distinct combination of linkage contact points and vibration intensities - and evaluated them in a grounded pilot study, where participants achieved 86.5% recognition accuracy, with no significant differences between patterns (F(4, 35) = 1.47, p = 0.23). Complementary flight demonstrations confirmed stable hover performance and consistent force output under realistic operating conditions. These pilot results validate the feasibility of drone-mounted, multi-contact haptic feedback and lay the groundwork for future integration into fully immersive VR, teleoperation, and remote interaction scenarios.