A Neural Tension Operator for Curve Subdivision across Constant Curvature Geometries

📅 2026-03-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes the first unified subdivision framework operating across constant-curvature spaces—Euclidean, spherical, and hyperbolic—addressing a core limitation of traditional interpolation-based subdivision schemes: reliance on a single global tension parameter and separate formulations for each geometry. The method employs a lightweight neural network (140K parameters) that consumes local intrinsic features and a learnable geometric embedding to predict a structurally safe insertion angle for each edge. It introduces a constrained sigmoid output head and a Riemannian manifold-aware subdivision operator, supported by theoretical convergence guarantees and an adaptive tension heuristic. Experiments demonstrate consistent superiority over baselines across 240 validation curves, achieving lower bending energy and angular roughness; notably, on out-of-distribution orbital trajectories, the method reduces bending energy by 41% and angular roughness by 68%.
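The constrained sigmoid head mentioned above can be sketched in a few lines. This is an illustrative reconstruction, not the paper's code: the bound values `theta_min`/`theta_max` and the function name are assumptions. The key property shown is that the sigmoid maps any finite logit strictly into (0, 1), so the predicted angle always stays inside the safety interval regardless of the learned weights.

```python
import math

def constrained_tension_head(z, theta_min=0.0, theta_max=math.pi / 4):
    """Map a raw network logit z to an insertion angle in the fixed
    safety interval (theta_min, theta_max).

    sigmoid(z) lies strictly in (0, 1) for every finite z, so the
    returned angle is structurally safe for any finite weight
    configuration -- the guarantee claimed for the output head.
    The interval bounds here are illustrative placeholders.
    """
    s = 1.0 / (1.0 + math.exp(-z))  # sigmoid, strictly in (0, 1)
    return theta_min + (theta_max - theta_min) * s
```

For example, a logit of 0 maps to the interval midpoint, and even extreme logits such as ±10 remain strictly inside the bounds.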
📝 Abstract
Interpolatory subdivision schemes generate smooth curves from piecewise-linear control polygons by repeatedly inserting new vertices. Classical schemes rely on a single global tension parameter and typically require separate formulations in Euclidean, spherical, and hyperbolic geometries. We introduce a shared learned tension predictor that replaces the global parameter with per-edge insertion angles predicted by a single 140K-parameter network. The network takes local intrinsic features and a trainable geometry embedding as input, and the predicted angles drive geometry-specific insertion operators across all three spaces without architectural modification. A constrained sigmoid output head enforces a structural safety bound, guaranteeing that every inserted vertex lies within a valid angular range for any finite weight configuration. Three theoretical results accompany the method: a structural guarantee of tangent-safe insertions; a heuristic motivation for per-edge adaptivity; and a conditional convergence certificate for continuously differentiable limit curves, subject to an explicit Lipschitz constraint verified post hoc. On 240 held-out validation curves, the learned predictor occupies a distinct position on the fidelity–smoothness Pareto frontier, achieving markedly lower bending energy and angular roughness than all fixed-tension and manifold-lift baselines. Riemannian manifold lifts retain a pointwise-fidelity advantage, which this study quantifies directly. On the out-of-distribution ISS orbital ground-track example, bending energy falls by 41% and angular roughness by 68% with only a modest increase in Hausdorff distance, suggesting that the predictor generalises beyond its synthetic training distribution.
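The abstract's "geometry-specific insertion operators" place each new vertex relative to a geodesic between its edge endpoints. As a minimal sketch of the spherical case, the snippet below computes a point along the great-circle arc between two unit vectors (standard spherical linear interpolation); the function name is hypothetical, and a full insertion operator would additionally rotate this base point by the network-predicted angle, which is omitted here.

```python
import numpy as np

def sphere_geodesic_point(p, q, t=0.5):
    """Point at fraction t along the great-circle arc from p to q,
    where p and q are unit vectors on the sphere.

    Uses the slerp formula: the result stays on the unit sphere, so
    an insertion operator built on it never leaves the manifold.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Angle between the endpoints, clipped for numerical safety.
    omega = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    if omega < 1e-12:
        return p  # endpoints coincide; arc degenerates to a point
    return (np.sin((1.0 - t) * omega) * p + np.sin(t * omega) * q) / np.sin(omega)
```

With `t=0.5` this yields the geodesic midpoint, the natural base point for an interpolatory insertion step; the Euclidean and hyperbolic analogues replace the trigonometric weights with linear and hyperbolic ones, respectively.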
Problem

Research questions and friction points this paper is trying to address.

subdivision
constant curvature geometry
tension parameter
interpolatory curves
Riemannian manifolds
Innovation

Methods, ideas, or system contributions that make the work stand out.

neural tension operator
interpolatory subdivision
constant curvature geometry
learned geometric predictor
Riemannian manifold