FlatCAD: Fast Curvature Regularization of Neural SDFs for CAD Models

📅 2025-06-19
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Neural signed distance functions (SDFs) for CAD-level geometric modeling suffer from prohibitive computational overhead due to Gaussian curvature regularization, which requires full Hessian computation and second-order automatic differentiation—leading to excessive memory consumption and runtime. To address this, we propose a lightweight curvature regularization that constrains only the mixed second-order derivatives in the Weingarten map, bypassing full Hessian evaluation. We introduce two equivalent proxy schemes—finite-difference approximation and Hessian-vector products—achieving a favorable trade-off between accuracy and efficiency. This is the first approach to incorporate geometric priors explicitly through the intrinsic curvature structure of differentiable surfaces. On the ABC dataset, our method matches or surpasses the reconstruction fidelity of Hessian-based baselines while reducing GPU memory usage and inference time by 50%. The design enables plug-and-play integration and cross-framework deployment.

Technology Category

Application Category

📝 Abstract
Neural signed-distance fields (SDFs) have become a versatile backbone for geometric learning, yet enforcing developable, CAD-style behavior still hinges on Gaussian curvature penalties that require full Hessian evaluation and second-order automatic differentiation, both of which are costly in memory and runtime. We present a curvature proxy that regularizes only the mixed second-order term (Weingarten term), allowing the two principal curvatures to adapt freely to data while suppressing unwanted warp. Two complementary instantiations realize this idea: (i) a finite-difference proxy that replaces each Hessian entry with four forward SDF evaluations and a single first-order gradient, and (ii) an autodiff proxy that computes the same mixed derivative via one Hessian-vector product, sidestepping explicit full Hessian assembly and remaining faster in practice. Both variants converge to the exact mixed second derivative, thus preserving the intended geometric bias without incurring full second-order graphs. On the ABC benchmarks, the proxies match or exceed the reconstruction fidelity of Hessian-based baselines while reducing GPU memory use and wall-clock time by a factor of two. Because the method is drop-in and framework-agnostic, it opens a practical path toward scalable, curvature-aware SDF learning for engineering-grade shape reconstruction.
Problem

Research questions and friction points this paper is trying to address.

Enforce CAD-style behavior in neural SDFs efficiently
Reduce memory and runtime costs of curvature regularization
Improve scalability of curvature-aware SDF learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Curvature proxy regularizes mixed second-order term
Finite-difference proxy uses four forward SDF evaluations
Autodiff proxy computes via one Hessian-vector product
H
Haotian Yin
New Jersey Institute of Technology, United States
A
Aleksander Plocharski
Warsaw University of Technology, Poland
M
Mikolaj Kida
Warsaw University of Technology, Poland
Przemyslaw Musialski
Przemyslaw Musialski
New Jersey Institute of Technology
Computer GraphicsGeometry ProcessingGeometric ModelingComputational FabricationMachine Learn