A Finite Difference Approximation of Second Order Regularization of Neural-SDFs

📅 2025-11-12
📈 Citations: 1
Influential: 0
🤖 AI Summary
To address the high computational and memory overhead of curvature regularization in neural signed distance field (SDF) learning, which stems from its reliance on second-order automatic differentiation, this paper proposes a lightweight finite-difference regularization framework. It introduces, for the first time, an O(h²)-accurate finite-difference stencil for explicit SDF curvature modeling, bypassing Hessian construction and second-order gradients entirely. The method enables plug-and-play approximations of both Gaussian-curvature and rank-deficiency losses. Empirically, it matches the reconstruction accuracy of automatic-differentiation-based curvature regularization while reducing GPU memory consumption and training time by up to 50%, and it remains robust on sparse, incomplete, and non-CAD data. The core contribution is achieving high-fidelity geometric regularization at the cost of only low-order differentiation, significantly improving the efficiency and scalability of neural SDF learning.

📝 Abstract
We introduce a finite-difference framework for curvature regularization in neural signed distance field (SDF) learning. Existing approaches enforce curvature priors using full Hessian information obtained via second-order automatic differentiation, which is accurate but computationally expensive. Other methods reduce this overhead by avoiding explicit Hessian assembly but still require higher-order differentiation. In contrast, our method replaces these operations with lightweight finite-difference stencils that approximate second derivatives via the well-known Taylor expansion with a truncation error of O(h²), and can serve as drop-in replacements for Gaussian-curvature and rank-deficiency losses. Experiments demonstrate that our finite-difference variants achieve reconstruction fidelity comparable to their automatic-differentiation counterparts while reducing GPU memory usage and training time by up to a factor of two. Additional tests on sparse, incomplete, and non-CAD data confirm that the proposed formulation is robust and general, offering an efficient and scalable alternative for curvature-aware SDF learning.
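The O(h²) truncation error mentioned in the abstract corresponds to the standard central-difference stencils obtained from Taylor expansion; the paper's exact stencil layout is not reproduced on this page, so take the following as the textbook forms, with e_i denoting the i-th coordinate axis:

```latex
\frac{\partial^2 f}{\partial x_i^2}(\mathbf{x})
  = \frac{f(\mathbf{x}+h\mathbf{e}_i) - 2f(\mathbf{x}) + f(\mathbf{x}-h\mathbf{e}_i)}{h^2}
  + O(h^2),
\qquad
\frac{\partial^2 f}{\partial x_i \partial x_j}(\mathbf{x})
  = \frac{f(\mathbf{x}+h\mathbf{e}_i+h\mathbf{e}_j)
        - f(\mathbf{x}+h\mathbf{e}_i-h\mathbf{e}_j)
        - f(\mathbf{x}-h\mathbf{e}_i+h\mathbf{e}_j)
        + f(\mathbf{x}-h\mathbf{e}_i-h\mathbf{e}_j)}{4h^2}
  + O(h^2).
```

Each entry of the Hessian is thus recovered from a handful of extra forward evaluations of the network, which is what lets the method skip second-order automatic differentiation.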
Problem

Research questions and friction points this paper is trying to address.

Approximating second-order regularization for neural SDFs
Reducing computational cost of curvature regularization
Enabling efficient curvature-aware SDF learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Finite-difference stencils approximate second derivatives
Replaces Hessian with Taylor expansion for efficiency
Reduces GPU memory usage and training time
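The ideas in the bullets above can be illustrated with a minimal sketch. The snippet below assumes NumPy, uses an analytic sphere SDF as a stand-in for a trained neural SDF, and evaluates Gaussian curvature via Goldman's implicit-surface formula K = (∇fᵀ adj(H) ∇f) / |∇f|⁴; none of these specific choices come from the paper itself, only the core idea of replacing autodiff Hessians with O(h²) central-difference stencils:

```python
import numpy as np

def sphere_sdf(p, r=1.0):
    # Analytic unit-sphere SDF; stands in for a trained neural SDF f(x).
    return np.linalg.norm(p) - r

def fd_grad(f, x, h=1e-3):
    # O(h^2) central-difference gradient.
    g = np.zeros(3)
    for i in range(3):
        e = np.zeros(3); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def fd_hessian(f, x, h=1e-3):
    # O(h^2) central-difference Hessian: forward evaluations only,
    # no second-order autodiff graph is ever built.
    H = np.zeros((3, 3))
    for i in range(3):
        ei = np.zeros(3); ei[i] = h
        H[i, i] = (f(x + ei) - 2.0 * f(x) + f(x - ei)) / h**2
        for j in range(i + 1, 3):
            ej = np.zeros(3); ej[j] = h
            H[i, j] = H[j, i] = (f(x + ei + ej) - f(x + ei - ej)
                                 - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h**2)
    return H

def adjugate(A):
    # Adjugate (transposed cofactor matrix) of a 3x3 matrix; well defined
    # even when A is singular, unlike det(A) * inv(A).
    C = np.zeros((3, 3))
    for i in range(3):
        for j in range(3):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1.0) ** (i + j) * np.linalg.det(minor)
    return C.T

def gaussian_curvature(f, x, h=1e-3):
    # Goldman's formula for implicit surfaces: K = (g^T adj(H) g) / |g|^4.
    g = fd_grad(f, x, h)
    H = fd_hessian(f, x, h)
    return g @ adjugate(H) @ g / np.linalg.norm(g) ** 4

# A unit sphere has Gaussian curvature 1 everywhere on its surface.
K = gaussian_curvature(sphere_sdf, np.array([1.0, 0.0, 0.0]))
print(round(K, 4))  # → 1.0
```

In a training loop, a curvature penalty built this way only adds extra forward passes per sample point, which is the mechanism behind the reported memory and runtime savings over second-order autodiff.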
Haotian Yin
New Jersey Institute of Technology, United States
Aleksander Płocharski
Warsaw University of Technology and IDEAS NCBR, Poland
Michal Jan Wlodarczyk
Warsaw University of Technology, Poland
Przemyslaw Musialski
New Jersey Institute of Technology
Computer Graphics, Geometry Processing, Geometric Modeling, Computational Fabrication, Machine Learning