Rendering Anywhere You See: Renderability Field-guided Gaussian Splatting

📅 2025-04-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address unstable novel-view synthesis under sparse and non-uniform input viewpoints, this paper proposes a robust view synthesis method for complex scenes. The approach introduces three key innovations: (1) a renderability field that explicitly models spatial non-uniformity across input views to guide geometry-aware sampling of pseudo-views; (2) a hybrid data optimization strategy that jointly enforces geometric consistency on pseudo-views and texture fidelity on source views; and (3) an integrated rendering pipeline combining Gaussian splatting with a lightweight image restoration network to achieve end-to-end mapping from point-cloud projections to photorealistic RGB outputs. Evaluated on both synthetic and real-world datasets, the method significantly improves rendering stability for wide-baseline novel-view synthesis, outperforming state-of-the-art methods across comprehensive quantitative metrics.
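As a rough illustration of the first innovation, the sketch below builds a toy renderability field as a voxel grid scored by how well each region is covered by the input cameras, then samples pseudo-view positions where coverage is weakest. All names, the Gaussian distance weighting, and the inverse-coverage sampling rule are illustrative assumptions, not the paper's formulation.

```python
# Hypothetical sketch of a renderability field: a voxel grid scoring how
# well each region of the scene is covered by the input cameras, then
# sampling pseudo-views where coverage is weakest. The weighting scheme
# here is an assumption, not the paper's exact definition.
import numpy as np

def build_renderability_field(cam_positions, bounds, res=32, sigma=2.0):
    """Accumulate a per-voxel coverage score from input camera positions."""
    lo, hi = bounds  # scene axis-aligned bounding box corners, each shape (3,)
    axes = [np.linspace(lo[i], hi[i], res) for i in range(3)]
    gx, gy, gz = np.meshgrid(*axes, indexing="ij")
    voxels = np.stack([gx, gy, gz], axis=-1)            # (res, res, res, 3)
    field = np.zeros((res, res, res))
    for c in cam_positions:
        d = np.linalg.norm(voxels - c, axis=-1)         # distance to this camera
        field += np.exp(-(d ** 2) / (2 * sigma ** 2))   # nearby views count more
    return field, voxels

def sample_pseudo_views(field, voxels, n_views=16):
    """Pick pseudo-view positions preferentially where renderability is low."""
    flat = field.reshape(-1)
    weights = 1.0 / (flat + 1e-6)                       # favor poorly covered cells
    weights /= weights.sum()
    idx = np.random.choice(len(flat), size=n_views, replace=False, p=weights)
    return voxels.reshape(-1, 3)[idx]

cams = np.random.uniform(-1, 1, size=(20, 3))           # toy input camera centers
field, voxels = build_renderability_field(cams, (np.full(3, -1.0), np.full(3, 1.0)))
pseudo_positions = sample_pseudo_views(field, voxels)   # (16, 3) candidate centers
```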

📝 Abstract
Scene view synthesis, which generates novel views from limited perspectives, is increasingly vital for applications like virtual reality, augmented reality, and robotics. Unlike object-based tasks, such as generating 360° views of a car, scene view synthesis handles entire environments where non-uniform observations pose unique challenges for stable rendering quality. To address this issue, we propose a novel approach: renderability field-guided Gaussian splatting (RF-GS). This method quantifies input inhomogeneity through a renderability field, guiding pseudo-view sampling to enhance visual consistency. To ensure the quality of wide-baseline pseudo-views, we train an image restoration model to map point projections to visible-light styles. Additionally, our validated hybrid data optimization strategy effectively fuses information from pseudo-view angles and source-view textures. Comparative experiments on simulated and real-world data show that our method outperforms existing approaches in rendering stability.
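To make the restoration step from the abstract concrete, here is a minimal sketch assuming the image restoration model is a small convolutional encoder-decoder that maps a sparse point-cloud projection (plus a validity mask) to a visible-light-style RGB image. The architecture, channel counts, and input layout are assumptions, not the paper's network.

```python
# A minimal sketch, assuming the restoration model is a small convolutional
# encoder-decoder taking a sparse point-cloud projection (3 color channels
# plus 1 validity-mask channel) and producing a photorealistic-style RGB
# image. All architectural details here are illustrative assumptions.
import torch
import torch.nn as nn

class RestorationNet(nn.Module):
    def __init__(self, in_ch=4, base=32):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(in_ch, base, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(base, base * 2, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(base * 2, base, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(base, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        # Fill holes in the sparse projection and restore visible-light style.
        return self.dec(self.enc(x))

# Toy usage: a 256x256 point projection with a validity-mask channel.
net = RestorationNet()
proj = torch.rand(1, 4, 256, 256)
rgb = net(proj)  # (1, 3, 256, 256) restored pseudo-view image
```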
Problem

Research questions and friction points this paper is trying to address.

Enhancing scene view synthesis from limited perspectives
Addressing non-uniform observations for stable rendering quality
Improving visual consistency with renderability field guidance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Renderability field-guided Gaussian splatting for scene synthesis
Image restoration model for wide-baseline pseudo-views
Hybrid data optimization fusing pseudo-view and source textures (see the sketch after this list)
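A minimal sketch of how such a hybrid objective could look, assuming training alternates between a geometric-consistency term on sampled pseudo-views and a texture-fidelity term on the original source views. The specific loss terms and weights are assumptions, not the paper's exact strategy.

```python
# Hypothetical hybrid data optimization: geometric consistency supervises
# pseudo-views, photometric fidelity supervises source views. Loss choices
# and weights are illustrative assumptions.
import torch
import torch.nn.functional as F

def hybrid_loss(render_rgb, render_depth, target, is_pseudo,
                lambda_geo=1.0, lambda_tex=1.0):
    """target holds a restored depth for pseudo-views, captured RGB for source views."""
    if is_pseudo:
        # Pseudo-views: enforce geometric consistency against the
        # point-cloud projection depth; trust its texture less.
        return lambda_geo * F.l1_loss(render_depth, target["depth"])
    # Source views: enforce texture fidelity against captured images.
    return lambda_tex * F.l1_loss(render_rgb, target["rgb"])

# Toy usage with random tensors standing in for rendered outputs.
rgb = torch.rand(1, 3, 64, 64)
depth = torch.rand(1, 1, 64, 64)
loss_src = hybrid_loss(rgb, depth, {"rgb": torch.rand(1, 3, 64, 64)}, is_pseudo=False)
loss_psd = hybrid_loss(rgb, depth, {"depth": torch.rand(1, 1, 64, 64)}, is_pseudo=True)
```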