🤖 AI Summary
Cloud gaming research has long suffered from non-reproducible QoE/QoS evaluation due to proprietary platform restrictions, unpredictable game engine behavior, and the scarcity of authentic operational logs. To address this, the authors propose an ordered, synchronized traffic capture-and-replay system supporting closed-loop action–response modeling. The method combines instruction-level logging, frame-accurate video timestamp alignment, programmable network latency injection, and precisely synchronized playback, enabling reproducible experiments across diverse network conditions. The system sidesteps vendor lock-in, supports quantitative QoE/QoS analysis across multiple scenarios (e.g., varying bandwidth, jitter, and packet loss), and is fully open-sourced, aiming to serve as shared experimental infrastructure for systematic cloud gaming evaluation.
📝 Abstract
Cloud Gaming (CG) research faces challenges due to the unpredictability of game engines and restricted access to commercial platforms and their logs, which creates major obstacles to fair experimentation and evaluation. CGReplay captures and replays player commands and the corresponding video frames in an ordered and synchronized action-reaction loop, ensuring reproducibility. It enables Quality of Experience and Quality of Service (QoE/QoS) assessment under varying network conditions and serves as a foundation for broader CG research. The code is publicly available for further development.
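The core replay idea described above (ordered, timestamp-synchronized playback of captured player commands, with an optional injected network delay) can be sketched as follows. This is a minimal illustration, not CGReplay's actual implementation: the `Event` record, `replay` function, and `extra_latency` parameter are hypothetical names chosen for this sketch.

```python
import time
from dataclasses import dataclass

@dataclass
class Event:
    t: float      # capture timestamp, seconds since capture start
    command: str  # serialized player command payload

def replay(events, send, extra_latency=0.0,
           clock=time.monotonic, sleep=time.sleep):
    """Replay captured events in order, preserving inter-event timing.

    `extra_latency` illustrates programmable latency injection by
    shifting every event's scheduled send time (an assumption of this
    sketch; the real system's injection interface may differ).
    """
    start = clock()
    for ev in sorted(events, key=lambda e: e.t):  # enforce capture order
        target = start + ev.t + extra_latency
        delay = target - clock()
        if delay > 0:
            sleep(delay)  # wait until the event's scheduled replay time
        send(ev.command)

# Example: replay two captured commands through a collecting callback.
captured = [Event(0.00, "move_left"), Event(0.02, "fire")]
sent = []
replay(captured, sent.append)
```

Frame-accurate video alignment would additionally match each command's timestamp against decoded frame presentation times, which this sketch omits for brevity.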