GS2E: Gaussian Splatting is an Effective Data Generator for Event Stream Generation

📅 2025-05-21
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Existing synthetic event datasets predominantly rely on dense RGB videos, suffering from limited viewpoint diversity, geometric inconsistency, and high hardware costs. To address these limitations, GS2E introduces the first large-scale, high-fidelity synthetic event dataset generated from sparse multi-view real-world RGB images. It first reconstructs static scenes using 3D Gaussian Splatting, then synthesizes temporally dense, geometrically consistent event streams by integrating adaptive trajectory interpolation with physically grounded contrast-threshold modeling, ensuring robustness to illumination variations and motion dynamics. This approach breaks away from conventional video-based synthesis paradigms, enabling more realistic and scalable event generation. Evaluated on event-driven 3D reconstruction tasks, GS2E significantly improves model generalization across unseen scenes and viewpoints. Moreover, it establishes a new benchmark for event-based vision research, facilitating systematic evaluation of geometry-aware, lighting-robust, and motion-adaptive event processing methods.

๐Ÿ“ Abstract
We introduce GS2E (Gaussian Splatting to Event), a large-scale synthetic event dataset for high-fidelity event vision tasks, captured from real-world sparse multi-view RGB images. Existing event datasets are often synthesized from dense RGB videos, which typically lack viewpoint diversity and geometric consistency, or depend on expensive, difficult-to-scale hardware setups. GS2E overcomes these limitations by first reconstructing photorealistic static scenes using 3D Gaussian Splatting, and subsequently employing a novel, physically informed event simulation pipeline. This pipeline integrates adaptive trajectory interpolation with physically consistent event contrast-threshold modeling. Such an approach yields temporally dense and geometrically consistent event streams under diverse motion and lighting conditions, while ensuring strong alignment with underlying scene structures. Experimental results on event-based 3D reconstruction demonstrate GS2E's superior generalization capabilities and its practical value as a benchmark for advancing event vision research.
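The contrast-threshold event model referenced above follows the standard event-camera principle: a pixel fires an event when its log-intensity change since the last event at that pixel exceeds a threshold C. A minimal illustrative sketch of that principle (not the paper's code; function and variable names are assumptions, and real simulators emit multiple events per pixel when the change spans several thresholds):

```python
import numpy as np

def simulate_events(frames, timestamps, C=0.2, eps=1e-6):
    """Naive contrast-threshold event simulation (illustrative sketch).

    frames: sequence of HxW intensity images in [0, 1]
    timestamps: per-frame times in seconds
    C: contrast threshold (illustrative value)
    Returns a list of (t, x, y, polarity) events.
    """
    # per-pixel log-intensity reference, updated when a pixel fires
    ref = np.log(frames[0] + eps)
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        diff = np.log(frame + eps) - ref
        # pixels whose log-intensity changed by at least the threshold
        ys, xs = np.nonzero(np.abs(diff) >= C)
        for y, x in zip(ys, xs):
            pol = 1 if diff[y, x] > 0 else -1
            events.append((t, int(x), int(y), pol))
            ref[y, x] += pol * C  # move reference toward the current value
    return events
```

For example, a uniform brightening from 0.1 to 0.9 produces one positive-polarity event at every pixel.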
Problem

Research questions and friction points this paper is trying to address.

Generating high-fidelity event data from sparse multi-view RGB images
Overcoming the limited viewpoint diversity and geometric inconsistency of existing synthetic event datasets
Improving event-based 3D reconstruction through realistic simulation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses 3D Gaussian Splatting for scene reconstruction
Employs adaptive trajectory interpolation method
Models event contrast thresholds in a physically consistent way
Yuchen Li
School of Electronic and Computer Engineering, Peking University
Chaoran Feng
🎓 Peking University
3D Vision, Event-based Vision, mLLM/VLM
Zhenyu Tang
School of Electronic and Computer Engineering, Peking University
Kaiyuan Deng
Holcombe Department of Electrical and Computer Engineering, Clemson University
Wangbo Yu
Peking University
3D Vision, AIGC
Yonghong Tian
School of Electronic and Computer Engineering, Peking University
Li Yuan