π€ AI Summary
Existing 3D reconstruction methods struggle to jointly model surface geometry of both transparent and opaque objects, exhibiting strong sensitivity to material properties. This work presents the first generalizable multi-view surface reconstruction framework for heterogeneous materials. We theoretically establish that, in NeuSβs implicit signed distance field, transparent and opaque surfaces correspond respectively to non-negative local minima and the zero-level set. Leveraging this insight, we propose a dual-modal isosurface extraction mechanism driven by DCUDF (Dual-Critical Unsigned Distance Field). Our method extends Ξ±-NeuS with differentiable distance field modeling and is rigorously evaluated on a novel synthetic-real hybrid benchmark encompassing diverse material types. On our newly constructed multi-material benchmark, the approach achieves simultaneous high-fidelity reconstruction of both transparent and opaque surfaces, significantly improving robustness in complex, mixed-material scenes. Code and data are publicly released.
π Abstract
Traditional 3D shape reconstruction techniques from multi-view images, such as structure from motion and multi-view stereo, face challenges in reconstructing transparent objects. Recent advances in neural radiance fields and its variants primarily address opaque or transparent objects, encountering difficulties to reconstruct both transparent and opaque objects simultaneously. This paper introduces $alpha$-Neus -- an extension of NeuS -- that proves NeuS is unbiased for materials from fully transparent to fully opaque. We find that transparent and opaque surfaces align with the non-negative local minima and the zero iso-surface, respectively, in the learned distance field of NeuS. Traditional iso-surfacing extraction algorithms, such as marching cubes, which rely on fixed iso-values, are ill-suited for such data. We develop a method to extract the transparent and opaque surface simultaneously based on DCUDF. To validate our approach, we construct a benchmark that includes both real-world and synthetic scenes, demonstrating its practical utility and effectiveness. Our data and code are publicly available at https://github.com/728388808/alpha-NeuS.