E-3DGS: Event-Based Novel View Rendering of Large-Scale Scenes Using 3D Gaussian Splatting

📅 2025-02-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing event-based novel-view-synthesis methods face significant bottlenecks in large-scale, unbounded scenes: they are largely confined to forward-facing or object-centric settings, even though event cameras themselves are well suited to low-light conditions, high-speed motion, and high-dynamic-range scenes. Method: This work introduces 3D Gaussian splatting into an event-based neural rendering framework for the first time, proposing a sparse spatiotemporal event voxelization model and an end-to-end differentiable rendering architecture. Contribution/Results: The authors contribute the first real and synthetic event datasets tailored for large-scale scenes. Experiments show PSNR improvements of 11–25% over EventNeRF, with reconstruction and rendering that are orders of magnitude faster, enabling high-fidelity novel-view synthesis in large-scale environments for the first time.
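The summary mentions a sparse spatiotemporal event voxelization model. The paper's exact formulation is not given here, but the general idea of binning an event stream (x, y, t, polarity) into a temporal voxel grid can be sketched as follows; the function name and grid layout are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def voxelize_events(events, sensor_hw, n_bins):
    """Accumulate an event stream into a spatiotemporal voxel grid.

    Illustrative sketch of generic event voxelization; the paper's
    actual model may differ (e.g. sparse storage, learned weighting).

    events: (N, 4) array with columns (x, y, t, polarity in {-1, +1})
    sensor_hw: (height, width) of the event sensor
    n_bins: number of temporal bins
    """
    H, W = sensor_hw
    grid = np.zeros((n_bins, H, W), dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    t = events[:, 2]
    p = events[:, 3]
    # Normalize timestamps into [0, n_bins) and assign each event a bin.
    span = max(t.max() - t.min(), 1e-9)
    b = ((t - t.min()) / span * (n_bins - 1e-6)).astype(int)
    # Scatter-add signed polarities; np.add.at handles repeated indices.
    np.add.at(grid, (b, y, x), p)
    return grid
```

A dense grid like this is easy to feed to a differentiable renderer; a sparse variant would store only the nonzero voxels, which matters at large scale where most of the grid is empty.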

📝 Abstract
Novel view synthesis techniques predominantly utilize RGB cameras, inheriting their limitations such as the need for sufficient lighting, susceptibility to motion blur, and restricted dynamic range. In contrast, event cameras are significantly more resilient to these limitations but have been less explored in this domain, particularly in large-scale settings. Current methodologies primarily focus on front-facing or object-oriented (360-degree view) scenarios. For the first time, we introduce 3D Gaussians for event-based novel view synthesis. Our method reconstructs large and unbounded scenes with high visual quality. We contribute the first real and synthetic event datasets tailored for this setting. Our method demonstrates superior novel view synthesis and consistently outperforms the baseline EventNeRF by a margin of 11-25% in PSNR (dB) while being orders of magnitude faster in reconstruction and rendering.
Problem

Research questions and friction points this paper is trying to address.

Extending event-based novel view synthesis from forward-facing or object-centric setups to large-scale, unbounded scenes
Overcoming RGB camera limitations such as insufficient lighting, motion blur, and restricted dynamic range
Slow reconstruction and rendering in prior event-based methods such as EventNeRF
Innovation

Methods, ideas, or system contributions that make the work stand out.

First application of 3D Gaussian Splatting to event-based novel view synthesis
Sparse spatiotemporal event voxelization with an end-to-end differentiable rendering architecture
First real and synthetic event datasets for large-scale scenes, with orders-of-magnitude faster reconstruction and rendering