🤖 AI Summary
To address low surface reconstruction fidelity and pervasive artifacts in unbounded scenes modeled by 3D Gaussian splatting, this paper proposes an efficient, high-fidelity mesh extraction framework. First, a hierarchical reordering mechanism replaces global depth sorting to mitigate error propagation from inaccurate depth estimation. Second, an opacity-field-guided level-set regularization and a geometric consistency loss are introduced to enhance the topological stability and geometric accuracy of the reconstructed surface. Third, Gaussian depth modeling is refined, and a parallelized Marching Tetrahedra algorithm is implemented to accelerate mesh generation. Experiments demonstrate that, while preserving rendering quality, the method reduces total processing time by more than 3× and accelerates mesh generation by 10× compared to prior approaches based on depth thresholding or voxelization, achieving superior surface fidelity and computational efficiency.
📝 Abstract
Recent advances in 3D Gaussian representations have significantly improved the quality and efficiency of image-based scene reconstruction. Their explicit nature facilitates real-time rendering and fast optimization, yet extracting accurate surfaces, particularly in large-scale, unbounded environments, remains a difficult task. Many existing methods rely on approximate depth estimates and global sorting heuristics, which can introduce artifacts and limit the fidelity of the reconstructed mesh. In this paper, we present Sorted Opacity Fields (SOF), a method designed to recover detailed surfaces from 3D Gaussians with both speed and precision. Our approach improves upon prior work by introducing hierarchical resorting and a robust formulation of Gaussian depth, which better aligns with the level set of the opacity field. To enhance mesh quality, we incorporate a level-set regularizer operating on the opacity field and introduce losses that encourage geometrically consistent primitive shapes. In addition, we develop a parallelized Marching Tetrahedra algorithm tailored to our opacity formulation, reducing meshing time by up to an order of magnitude. As demonstrated by our quantitative evaluation, SOF achieves higher reconstruction accuracy while cutting total processing time by more than a factor of three. These results mark a step forward in turning efficient Gaussian-based rendering into equally efficient geometry extraction.
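To make the meshing step concrete: the paper's parallelized, opacity-tailored implementation is not reproduced here, but the standard Marching Tetrahedra idea it builds on can be sketched as follows. This is a minimal, single-threaded Python illustration, assuming a discretized scalar opacity field sampled on a regular grid; the `iso` threshold, the particular 6-tetrahedra cube decomposition, and all function names are illustrative choices, not the authors' code.

```python
import numpy as np

# Split each grid cell (cube) into 6 tetrahedra; entries index the cube's 8 corners.
TETS = [(0, 5, 1, 6), (0, 1, 2, 6), (0, 2, 3, 6),
        (0, 3, 7, 6), (0, 7, 4, 6), (0, 4, 5, 6)]

# Offsets of the 8 cube corners relative to the cell origin (x, y, z).
CORNERS = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
           (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]

def interp(p0, p1, v0, v1, iso):
    """Linearly interpolate the iso-crossing point along the edge p0-p1."""
    t = (iso - v0) / (v1 - v0)
    return p0 + t * (p1 - p0)

def marching_tetrahedra(field, iso=0.5):
    """Extract a triangle soup approximating the iso-surface of a 3D scalar field.

    Each tetrahedron has only a handful of inside/outside configurations,
    which is what makes the method easy to evaluate independently (and hence
    parallelize) per tetrahedron.
    """
    tris = []
    nx, ny, nz = field.shape
    for x in range(nx - 1):
        for y in range(ny - 1):
            for z in range(nz - 1):
                pts = [np.array((x + dx, y + dy, z + dz), float)
                       for dx, dy, dz in CORNERS]
                vals = [field[int(p[0]), int(p[1]), int(p[2])] for p in pts]
                for tet in TETS:
                    p = [pts[i] for i in tet]
                    v = [vals[i] for i in tet]
                    inside = [i for i in range(4) if v[i] >= iso]
                    outside = [i for i in range(4) if v[i] < iso]
                    if not inside or not outside:
                        continue  # tetrahedron entirely on one side: no surface
                    if len(inside) == 1 or len(outside) == 1:
                        # One vertex isolated: the surface cuts off a corner (1 triangle).
                        a = inside[0] if len(inside) == 1 else outside[0]
                        rest = [i for i in range(4) if i != a]
                        tris.append([interp(p[a], p[b], v[a], v[b], iso)
                                     for b in rest])
                    else:
                        # Two inside, two outside: the surface is a quad (2 triangles).
                        a, b = inside
                        c, d = outside
                        q = [interp(p[a], p[c], v[a], v[c], iso),
                             interp(p[a], p[d], v[a], v[d], iso),
                             interp(p[b], p[d], v[b], v[d], iso),
                             interp(p[b], p[c], v[b], v[c], iso)]
                        tris.append([q[0], q[1], q[2]])
                        tris.append([q[0], q[2], q[3]])
    return tris
```

Because every tetrahedron is processed independently, the loop body maps directly onto a GPU kernel, which is the property the paper exploits; the level-set regularization described above would additionally encourage the opacity field to be well-behaved near the chosen iso-value before this extraction runs.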