🤖 AI Summary
To address the low reconstruction fidelity of distant objects and the difficulty of unified near–far scale modeling in outdoor unbounded scenes with 3D Gaussian Splatting (3DGS), this work introduces homogeneous coordinates into the 3DGS framework for the first time. Leveraging projective geometry, it enables a unified geometric representation for both near and far objects, eliminating the reliance on conventional multi-scale or hierarchical designs. The method comprises a homogeneous coordinate transformation, a compatible differentiable projection-sampling scheme, refined density modeling, and an enhanced rendering pipeline. Experiments demonstrate that it significantly improves rendering quality for distant objects (+2.1 dB PSNR) while preserving high-fidelity near-field reconstruction and real-time performance; training efficiency remains comparable to standard 3DGS. By transcending the limitations of Cartesian coordinates, this work establishes a new paradigm for open-world 3D reconstruction.
📝 Abstract
Novel view synthesis has seen impressive progress recently, with 3D Gaussian Splatting (3DGS) offering fast training and photorealistic real-time rendering. However, its reliance on Cartesian coordinates limits 3DGS's performance on distant objects, which matters for reconstructing unbounded outdoor environments. We found that, despite its simplicity, using homogeneous coordinates, a concept from projective geometry, in the 3DGS pipeline remarkably improves the rendering accuracy of distant objects. We therefore propose Homogeneous Gaussian Splatting (HoGS), which incorporates homogeneous coordinates into the 3DGS framework and provides a unified representation that enhances both near and distant objects. By adopting principles of projective geometry, HoGS effectively handles both expansive spatial positions and scales, particularly in outdoor unbounded environments. Experiments show that HoGS significantly improves the accuracy of reconstructing distant objects while maintaining high-quality rendering of nearby objects, along with fast training and real-time rendering. Our implementation is available on our project page: https://kh129.github.io/hogs/.
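To make the core idea concrete, here is a minimal sketch (not the authors' implementation) of why homogeneous coordinates suit unbounded scenes: a point is stored as (x, y, z, w), with Cartesian position (x/w, y/w, z/w). Shrinking w toward zero moves the point toward infinity along a fixed direction while every stored coordinate stays bounded, so near and far points share one representation.

```python
def to_cartesian(p):
    """Convert a homogeneous point (x, y, z, w) to Cartesian (x/w, y/w, z/w)."""
    x, y, z, w = p
    return (x / w, y / w, z / w)

# A nearby point and a point in the same direction but 10,000x farther away.
# Both are represented with coordinates of comparable magnitude; only w differs.
near = (1.0, 2.0, 3.0, 1.0)
far = (1.0, 2.0, 3.0, 1e-4)

print(to_cartesian(near))  # (1.0, 2.0, 3.0)
print(to_cartesian(far))   # (10000.0, 20000.0, 30000.0)
```

This is the same perspective-division convention used throughout projective geometry; optimizing w (rather than an unbounded Cartesian position) keeps gradients and parameter magnitudes well-behaved for distant content.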