Scheduling the Off-Diagonal Weingarten Loss of Neural SDFs for CAD Models

πŸ“… 2025-11-05
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Neural signed distance functions (SDFs) for CAD point cloud reconstruction suffer from optimization instability during early training or loss of fine geometric details in later stages when employing fixed-curvature regularization weights. To address this, we propose a dynamic scheduling mechanism for the Off-Diagonal Weingarten (ODW) lossβ€”the first such time-varying weight scheme for CAD reconstruction. Leveraging constant, linear, quintic, and stepwise interpolation strategies, ODW regularization is strengthened initially to stabilize gradient-based optimization and gradually decayed later to preserve high-frequency surface details. Integrated into the FlatCAD framework, our method retains its joint gradient-and-curvature regularization structure. On the ABC dataset, our approach reduces Chamfer distance by up to 35% over FlatCAD, significantly improving reconstruction accuracy and structural fidelity. This demonstrates both the effectiveness and necessity of dynamically scheduled curvature priors in neural SDF-based CAD reconstruction.

Technology Category

Application Category

πŸ“ Abstract
Neural signed distance functions (SDFs) have become a powerful representation for geometric reconstruction from point clouds, yet they often require both gradient- and curvature-based regularization to suppress spurious warp and preserve structural fidelity. FlatCAD introduced the Off-Diagonal Weingarten (ODW) loss as an efficient second-order prior for CAD surfaces, approximating full-Hessian regularization at roughly half the computational cost. However, FlatCAD applies a fixed ODW weight throughout training, which is suboptimal: strong regularization stabilizes early optimization but suppresses detail recovery in later stages. We present scheduling strategies for the ODW loss that assign a high initial weight to stabilize optimization and progressively decay it to permit fine-scale refinement. We investigate constant, linear, quintic, and step interpolation schedules, as well as an increasing warm-up variant. Experiments on the ABC CAD dataset demonstrate that time-varying schedules consistently outperform fixed weights. Our method achieves up to a 35% improvement in Chamfer Distance over the FlatCAD baseline, establishing scheduling as a simple yet effective extension of curvature regularization for robust CAD reconstruction.
Problem

Research questions and friction points this paper is trying to address.

Optimizing curvature regularization schedules for neural SDF training
Balancing early stabilization with detail recovery in CAD reconstruction
Improving geometric accuracy through adaptive Weingarten loss scheduling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Scheduling Off-Diagonal Weingarten loss dynamically
Using time-varying weight schedules for optimization stability
Progressively decaying regularization to recover fine details
πŸ”Ž Similar Papers
No similar papers found.