VizFlyt: Perception-centric Pedagogical Framework For Autonomous Aerial Robots

📅 2025-03-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the lack of reliable, high-performance testing platforms in autonomous aerial robotics education, this paper proposes a perception-first Hardware-in-the-Loop (HITL) teaching framework. The framework pioneers the use of 3D Gaussian Splatting for real-time visual simulation, driven by externally tracked pose estimates to generate photorealistic, low-latency sensor data. It tightly integrates ROS, open-source hardware, and real-time rendering to achieve a system update rate exceeding 100 Hz. Key contributions include: (1) the first open-source, co-designed software–hardware HITL framework tailored for aerial robotics courses and explicitly perception-driven; and (2) the first pedagogical implementation integrating 3D Gaussian Splatting into educational simulation. The framework has been successfully deployed across multiple undergraduate and graduate courses, and is fully open-sourced—including reference code, benchmark datasets, hardware schematics, and instructional video resources.

📝 Abstract
Autonomous aerial robots are becoming commonplace in our lives. Hands-on aerial robotics courses are pivotal in training the next-generation workforce to meet the growing market demands. Such an efficient and compelling course depends on a reliable testbed. In this paper, we present *VizFlyt*, an open-source perception-centric Hardware-In-The-Loop (HITL) photorealistic testing framework for aerial robotics courses. We utilize pose from an external localization system to hallucinate real-time and photorealistic visual sensors using 3D Gaussian Splatting. This enables stress-free testing of autonomy algorithms on aerial robots without the risk of crashing into obstacles. We achieve over 100 Hz of system update rate. Lastly, we build upon our past experiences of offering hands-on aerial robotics courses and propose a new open-source and open-hardware curriculum based on *VizFlyt* for the future. We test our framework on various course projects in real-world HITL experiments and present the results showing the efficacy of such a system and its large potential use cases. Code, datasets, hardware guides and demo videos are available at https://pear.wpi.edu/research/vizflyt.html
Problem

Research questions and friction points this paper is trying to address.

Develops a photorealistic testing framework for aerial robotics courses
Enables safe testing of autonomy algorithms without crash risks
Proposes an open-source curriculum for hands-on aerial robotics training
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Hardware-In-The-Loop photorealistic testing
Employs 3D Gaussian Splatting for real-time visuals
Achieves over 100Hz system update rate
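The innovations above amount to a pose-driven render loop: read a pose from an external tracker, query a 3D Gaussian Splatting renderer at that pose, and hand the frame to the autonomy stack, all while holding a fixed update rate. A minimal, self-contained sketch of that pattern is below; all names are hypothetical stand-ins (the actual system uses ROS and a GPU 3DGS renderer, neither of which is modeled here):

```python
import time
from dataclasses import dataclass

@dataclass
class Pose:
    """6-DoF pose: position plus unit quaternion orientation."""
    x: float; y: float; z: float
    qw: float; qx: float; qy: float; qz: float

def read_tracked_pose(t: float) -> Pose:
    # Stand-in for the external localization system (e.g. motion capture).
    # Here: a drone drifting along x at 0.5 m/s, level attitude.
    return Pose(0.5 * t, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0)

def render_view(pose: Pose) -> str:
    # Stand-in for querying a 3D Gaussian Splatting renderer at `pose`;
    # the real renderer would return a photorealistic image.
    return f"frame@({pose.x:.2f},{pose.y:.2f},{pose.z:.2f})"

def hitl_loop(duration_s: float = 0.1, target_hz: float = 100.0) -> float:
    """Run the pose->render loop for `duration_s`; return achieved rate (Hz)."""
    period = 1.0 / target_hz
    frames = 0
    start = time.perf_counter()
    while (now := time.perf_counter()) - start < duration_s:
        pose = read_tracked_pose(now - start)
        frame = render_view(pose)  # would be published to the autonomy stack
        frames += 1
        # Sleep off the remainder of the cycle to hold the target rate.
        remaining = period - (time.perf_counter() - now)
        if remaining > 0:
            time.sleep(remaining)
    return frames / (time.perf_counter() - start)

rate = hitl_loop()
```

Because the rendered sensor is hallucinated from tracked pose rather than an onboard camera, the vehicle can fly in an empty, netted space while the autonomy stack perceives a cluttered photorealistic scene, which is what removes the crash risk.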
Kushagra Srivastava
Perception and Autonomous Robotics (PeAR) Group, Worcester Polytechnic Institute
Rutwik Kulkarni
Perception and Autonomous Robotics (PeAR) Group, Worcester Polytechnic Institute
Manoj Velmurugan
Perception and Autonomous Robotics (PeAR) Group, Worcester Polytechnic Institute
Nitin J. Sanket
Assistant Professor, Worcester Polytechnic Institute
Computer Vision, Robotics, Quadrotors, Deep Learning, Image Processing