360-GeoGS: Geometrically Consistent Feed-Forward 3D Gaussian Splatting Reconstruction for 360 Images

📅 2026-01-05
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing feed-forward 3D Gaussian splatting methods suffer from insufficient geometric consistency in 360° image reconstruction, limiting their applicability to high-precision spatial perception tasks. This work proposes a feed-forward 3D Gaussian splatting framework tailored for 360° imagery, which introduces a depth-normal coupled geometric regularization mechanism to jointly optimize the position, rotation, and scale of Gaussian ellipsoids. This significantly enhances the geometric consistency of both point cloud and surface reconstructions while preserving high-fidelity rendering quality. To the best of our knowledge, this is the first approach within the feed-forward Gaussian splatting paradigm to simultaneously optimize rendering fidelity and geometric accuracy, establishing a reliable foundation for applications such as augmented reality, robotics, and digital twins.

📝 Abstract
3D scene reconstruction is fundamental for spatial intelligence applications such as AR, robotics, and digital twins. Traditional multi-view stereo struggles with sparse viewpoints or low-texture regions, while neural rendering approaches, though capable of producing high-quality results, require per-scene optimization and lack real-time efficiency. Explicit 3D Gaussian Splatting (3DGS) enables efficient rendering, but most feed-forward variants focus on visual quality rather than geometric consistency, limiting accurate surface reconstruction and overall reliability in spatial perception tasks. This paper presents a novel feed-forward 3DGS framework for 360 images, capable of generating geometrically consistent Gaussian primitives while maintaining high rendering quality. A Depth-Normal geometric regularization is introduced to couple rendered depth gradients with normal information, supervising Gaussian rotation, scale, and position to improve point cloud and surface accuracy. Experimental results show that the proposed method maintains high rendering quality while significantly improving geometric consistency, providing an effective solution for 3D reconstruction in spatial perception tasks.
Problem

Research questions and friction points this paper is trying to address: 3D reconstruction; geometric consistency; 3D Gaussian Splatting; 360 images; spatial perception.
Innovation

Methods, ideas, or system contributions that make the work stand out: 3D Gaussian Splatting; geometric consistency; Depth-Normal regularization; feed-forward reconstruction; 360 images.
👥 Authors

Jiaqi Yao
Shanghai Key Laboratory of Navigation and Location Based Services, Shanghai Jiao Tong University

Zhongmiao Yan
Shanghai Key Laboratory of Navigation and Location Based Services, Shanghai Jiao Tong University

Jingyi Xu
Shanghai Key Laboratory of Navigation and Location Based Services, Shanghai Jiao Tong University

Songpengcheng Xia
Shanghai Jiao Tong University

Yan Xiang
Shanghai Key Laboratory of Navigation and Location Based Services, Shanghai Jiao Tong University

Ling Pei
Shanghai Jiao Tong University