🤖 AI Summary
To address the feature degradation caused by spherical distortion and the ultra-wide field of view in monocular 360° depth estimation, this paper proposes a distortion-aware PanoGabor filtering and lossless fusion framework. The method introduces three key innovations: (1) latitude-aware PanoGabor filters that bring Gabor frequency-domain texture analysis to the sphere, with a spherical gradient constraint to stabilize their orientation sensitivity; (2) a channel-spatial unidirectional fusion module (CS-UFM) enabling lossless multi-scale feature fusion directly in the equirectangular projection (ERP) format; and (3) a linear latitude-aware distortion representation that customizes the filters to ERP distortion. Evaluated on three mainstream indoor 360° depth datasets, the approach achieves significant improvements over state-of-the-art methods, demonstrating superior distortion robustness and native ERP compatibility while delivering high-accuracy monocular spherical depth estimation.
📝 Abstract
Depth estimation from a monocular 360 image is important to the perception of the entire 3D environment. However, the inherent distortion and large field of view (FoV) in 360 images pose great challenges for this task. To this end, existing mainstream solutions typically introduce additional perspective-based 360 representations (e.g., Cubemap) to achieve effective feature extraction. Nevertheless, regardless of the introduced representations, they eventually need to be unified into the equirectangular projection (ERP) format for subsequent depth estimation, which inevitably reintroduces the troublesome distortions. In this work, we propose an oriented distortion-aware Gabor Fusion framework (PGFuse) to address the above challenges. First, we introduce Gabor filters that analyze texture in the frequency domain, thereby extending the receptive fields and enhancing depth cues. To address the reintroduced distortions, we design a linear latitude-aware distortion representation method to generate customized, distortion-aware Gabor filters (PanoGabor filters). Furthermore, we design a channel-wise and spatial-wise unidirectional fusion module (CS-UFM) that integrates the proposed PanoGabor filters to unify other representations into the ERP format, delivering effective and distortion-free features. Considering the orientation sensitivity of the Gabor transform, we introduce a spherical gradient constraint to stabilize this sensitivity. Experimental results on three popular indoor 360 benchmarks demonstrate the superiority of the proposed PGFuse over existing state-of-the-art solutions. Code will be made available upon acceptance.
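To make the core idea concrete: a standard 2D Gabor filter is a sinusoid windowed by a Gaussian, and in ERP images horizontal content is stretched by roughly 1/cos(latitude) toward the poles. The paper's exact PanoGabor parameterization is not given in the abstract, so the sketch below is only an illustrative assumption: it builds an ordinary Gabor kernel and a hypothetical latitude-aware variant whose wavelength and envelope are widened by the ERP stretch factor, so the filter at a given row matches the distorted texture there. The function names `gabor_kernel` and `panogabor_kernel` are ours, not the authors'.

```python
import numpy as np

def gabor_kernel(size, theta, lam, sigma, gamma=0.5):
    """Standard 2D Gabor kernel: a cosine carrier at orientation
    `theta` and wavelength `lam`, windowed by a Gaussian envelope
    with scale `sigma` and aspect ratio `gamma`."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates into the filter's orientation.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr) ** 2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / lam)
    return envelope * carrier

def panogabor_kernel(size, theta, lam, sigma, latitude):
    """Hypothetical latitude-aware variant (our assumption, not the
    paper's formula): scale wavelength and envelope by the ERP
    stretch 1/cos(latitude), so the filter at that image row matches
    the horizontally stretched texture near the poles."""
    stretch = 1.0 / max(np.cos(latitude), 1e-3)  # clamp near the poles
    return gabor_kernel(size, theta, lam * stretch, sigma * stretch)
```

At the equator (latitude 0) the stretch factor is 1, so the latitude-aware kernel reduces to the standard Gabor kernel; toward the poles it widens, mimicking how a distortion-aware filter bank could adapt per row of the ERP image.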