🤖 AI Summary
Current VR games lack systems that can generate diverse, high-fidelity player behaviors, a gap that hinders automated testing and synthetic data generation. We propose the first end-to-end motion generation framework tailored for VR games, which produces realistic full-body animations conditioned on in-game object layouts. By integrating style exemplar guidance, reinforcement learning, and physics-based simulation, our approach enables controllable synthesis of player actions. Trained on the large-scale BOXRR-23 dataset, the system supports customization of agent behaviors by skill level and motion style, with a learnable scoring mechanism to optimize simulation quality. Demonstrated in *Beat Saber*, our framework reproduces high-skill, diverse player behaviors, significantly improving the effectiveness of VR game testing and data synthesis.
📝 Abstract
We present the first motion generation system for playtesting virtual reality (VR) games. Our player model generates VR headset and handheld controller movements from in-game object arrangements, guided by style exemplars and optimized to maximize simulated gameplay score. We train on the large-scale BOXRR-23 dataset and apply our framework to the popular VR game Beat Saber. The resulting model, Robo-Saber, produces skilled gameplay and captures diverse player behaviors, mirroring the skill levels and movement patterns specified by input style exemplars. Robo-Saber shows promise for synthesizing rich gameplay data for predictive applications and for enabling a physics-based whole-body VR playtesting agent.
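The abstract describes a conditioning interface: the player model takes an in-game object arrangement plus a style exemplar and emits tracked-device motion (headset and two controllers). The sketch below is purely illustrative; all names and the trivial generation logic are hypothetical and do not reflect the paper's published API or model, which uses learned generation rather than the placeholder heuristic shown here.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical types for illustration only -- not the paper's actual API.
Pose = Tuple[float, float, float]  # 3-D position of one tracked device

@dataclass
class Frame:
    """One timestep of tracked VR devices: headset + two controllers."""
    headset: Pose
    left_controller: Pose
    right_controller: Pose

def generate_motion(object_layout: List[Pose],
                    style_exemplar: List[Frame]) -> List[Frame]:
    """Toy stand-in for the generator: emits one frame per in-game object,
    borrowing the style exemplar's average headset height as a crude
    'style' signal. The real system is a learned, physics-based model."""
    if style_exemplar:
        avg_h = sum(f.headset[1] for f in style_exemplar) / len(style_exemplar)
    else:
        avg_h = 1.6  # assumed default standing headset height in metres
    frames = []
    for (x, y, z) in object_layout:
        # Reach both controllers toward each object, offset left/right.
        frames.append(Frame(headset=(0.0, avg_h, 0.0),
                            left_controller=(x - 0.2, y, z),
                            right_controller=(x + 0.2, y, z)))
    return frames
```

A caller would pass the level's object positions and a short recorded exemplar clip; the output frame sequence could then feed a physics simulation or a gameplay scorer.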