From Transparent to Opaque: Rethinking Neural Implicit Surfaces with α-NeuS

📅 2024-11-08
🏛️ arXiv.org
📈 Citations: 1
✨ Influential: 0
📄 PDF
🤖 AI Summary
Existing multi-view 3D reconstruction methods struggle to jointly model the surface geometry of transparent and opaque objects, since their behavior is highly sensitive to material properties. This work extends NeuS to α-NeuS and proves that NeuS remains unbiased for materials ranging from fully transparent to fully opaque. The key observation is that, in NeuS's learned distance field, transparent surfaces correspond to non-negative local minima while opaque surfaces correspond to the zero level set. Because fixed-isovalue extraction such as marching cubes cannot capture both cases, the authors develop a DCUDF-based method that extracts transparent and opaque surfaces simultaneously. Evaluated on a newly constructed benchmark of real-world and synthetic scenes with mixed materials, the approach achieves high-fidelity reconstruction of both surface types. Code and data are publicly released.

πŸ“ Abstract
Traditional 3D shape reconstruction techniques from multi-view images, such as structure from motion and multi-view stereo, face challenges in reconstructing transparent objects. Recent advances in neural radiance fields and their variants primarily address opaque or transparent objects and encounter difficulties when reconstructing both simultaneously. This paper introduces $\alpha$-NeuS -- an extension of NeuS -- that proves NeuS is unbiased for materials from fully transparent to fully opaque. We find that transparent and opaque surfaces align with the non-negative local minima and the zero iso-surface, respectively, in the learned distance field of NeuS. Traditional iso-surface extraction algorithms, such as marching cubes, which rely on fixed iso-values, are ill-suited for such data. We develop a method to extract the transparent and opaque surfaces simultaneously based on DCUDF. To validate our approach, we construct a benchmark that includes both real-world and synthetic scenes, demonstrating its practical utility and effectiveness. Our data and code are publicly available at https://github.com/728388808/alpha-NeuS.
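The abstract's core observation (opaque surfaces sit on the zero iso-surface, transparent surfaces at non-negative local minima, which fixed-isovalue extraction misses) can be illustrated with a toy 1D sketch. The distance field below is a hypothetical example for illustration only, not the paper's learned field or method:

```python
# Toy 1D illustration (hypothetical field, not the paper's learned SDF):
# an opaque surface appears as a sign change (zero crossing), while a
# transparent surface appears as a non-negative local minimum that never
# reaches zero. Extraction at the fixed iso-value 0, as in marching cubes,
# detects only the former.

def zero_crossings(vals):
    """Sample intervals where the field changes sign (opaque surfaces)."""
    return [i for i in range(len(vals) - 1) if vals[i] * vals[i + 1] < 0]

def nonneg_local_minima(vals):
    """Interior samples that are non-negative strict local minima
    (where transparent surfaces sit in such a field)."""
    return [i for i in range(1, len(vals) - 1)
            if vals[i] >= 0 and vals[i] < vals[i - 1] and vals[i] < vals[i + 1]]

# Hypothetical field: a transparent sheet near x = 0 (dip to ~0.09 that
# stays positive) and an opaque wall near x = 2 (sign change).
field = lambda x: min(abs(x) + 0.05, 2.0 - x)

xs = [-1.0 + 0.13 * i for i in range(30)]
vals = [field(x) for x in xs]

print(zero_crossings(vals))       # the opaque wall: one sign-change interval
print(nonneg_local_minima(vals))  # the transparent sheet: one positive dip
```

Level-0 marching cubes generalizes `zero_crossings` to 3D and therefore skips the positive dip entirely, which is why the paper resorts to a DCUDF-style extraction instead of a fixed iso-value.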
Problem

Research questions and friction points this paper is trying to address.

3D modeling
transparent objects
opaque objects
Innovation

Methods, ideas, or system contributions that make the work stand out.

α-NeuS
DCUDF
Transparent and Opaque Object Reconstruction
Authors
Haoran Zhang
Key Laboratory of System Software (CAS) and State Key Laboratory of Computer Science, Institute of Software, Chinese Academy of Sciences; University of Chinese Academy of Sciences
Junkai Deng
College of Computing and Data Science, Nanyang Technological University
Xuhui Chen
San Francisco State University
Computer Science
Fei Hou
Institute of Software, Chinese Academy of Sciences
Computer Graphics
Wencheng Wang
Professor, Institute of Software, Chinese Academy of Sciences
Visualization, Virtual Reality, Computer Graphics, Rendering, Imaging
Hong Qin
Department of Computer Science, Stony Brook University
Chen Qian
SenseTime Research
Ying He
College of Computing and Data Science, Nanyang Technological University