IGUANA: Immersive Guidance, Navigation, and Control for Consumer UAV

📅 2025-10-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the poor intuitiveness and weak situational awareness of conventional consumer drone control interfaces, this paper proposes IGUANA—a Mixed Reality (MR)-based immersive navigation and control system. Its key contributions are: (1) integration of high-accuracy 3D terrain mapping with drag-and-drop waypoint planning; (2) a virtual-sphere interaction metaphor that maps physical manipulation intuition to drone control; and (3) spatially registered guidance overlays fused with the real-time video feed during occlusion. User studies demonstrate that IGUANA significantly reduces cognitive load (p < 0.01), improves task accuracy by 27.4%, and achieves high operational consistency (ICC = 0.92). Participants particularly praised the 3D terrain visualization and spatial guidance. Although the virtual-sphere control was rated as highly intuitive, the absence of haptic feedback was identified as a key limitation for future refinement.

📝 Abstract
As the markets for unmanned aerial vehicles (UAVs) and mixed reality (MR) headsets continue to grow, recent research has increasingly explored their integration, which enables more intuitive, immersive, and situationally aware control systems. We present IGUANA, an MR-based immersive guidance, navigation, and control system for consumer UAVs. IGUANA introduces three key elements beyond conventional control interfaces: (1) a 3D terrain map interface with draggable waypoint markers and live camera preview for high-level control, (2) a novel spatial control metaphor that uses a virtual ball as a physical analogy for low-level control, and (3) a spatial overlay that helps track the UAV when it is not visible to the naked eye or the visual line of sight is interrupted. We conducted a user study to evaluate our design, both quantitatively and qualitatively, and found that (1) the 3D map interface is intuitive and easy to use, relieving users from manual control and suggesting improved accuracy and consistency with lower perceived workload relative to a conventional dual-stick controller, (2) the virtual ball interface is intuitive but limited by the lack of physical feedback, and (3) the spatial overlay is very useful in enhancing the users' situational awareness.
Problem

Research questions and friction points this paper is trying to address.

Developing immersive UAV control using mixed reality technology
Creating intuitive navigation interfaces for consumer drone operation
Enhancing situational awareness when visual line of sight is lost
Innovation

Methods, ideas, or system contributions that make the work stand out.

3D terrain map interface with draggable waypoint markers
Spatial control metaphor using virtual ball analogy
Spatial overlay for UAV tracking beyond visual line of sight
👤 Authors
Victor Victor
Chair of Software Technology, Technische Universität Dresden, Dresden, Germany
Tania Krisanty
Chair of Computer Graphics and Visualization, Technische Universität Dresden, Dresden, Germany
Matthew McGinity
Technische Universität Dresden
Stefan Gumhold
Chair of Computer Graphics and Visualization, Technische Universität Dresden, Dresden, Germany
Uwe Aßmann
Chair of Software Technology, Technische Universität Dresden, Dresden, Germany