A Convex-Inspired Neural Construction for Structured and Generalizable Nonlinear Model Reduction

📅 2025-11-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
Balancing physical fidelity with interactive performance in real-time deformable-object simulation remains challenging: linear dimensionality-reduction methods have limited expressiveness, while nonlinear alternatives generalize poorly. To address this, we propose a convex-optimization-inspired symmetric neural architecture, the first to integrate input-convex neural networks (ICNNs) with explicit symmetry constraints, to construct a structured, physics-consistent nonlinear decoder for reduced-order modeling. This design enforces convexity and mechanical plausibility in the latent solution space even under high compression ratios. As a result, the model generalizes well to unseen loading conditions, including varying force magnitudes and directions, and remains robust when trained on sparse data. Experiments demonstrate millisecond-level inference latency with physically realistic deformation behavior; the method outperforms state-of-the-art linear and nonlinear dimensionality-reduction approaches in both real-time interaction quality and robustness.

📝 Abstract
Real-time simulation of deformable objects relies on model reduction to achieve interactive performance while maintaining physical fidelity. Traditional linear methods, such as principal component analysis (PCA), provide structured and predictable behavior thanks to their linear formulation, but are limited in expressiveness. Nonlinear model reduction, typically implemented with neural networks, offers richer representations and higher compression; however, without structural constraints, the learned mappings often fail to generalize beyond the training distribution, leading to unstable or implausible deformations. We present a symmetric, convex-inspired neural formulation that bridges the gap between linear and nonlinear model reduction. Our approach adopts an input-convex neural network (ICNN) augmented with symmetry constraints to impose structure on the nonlinear decoder. This design retains the flexibility of neural mappings while embedding physical consistency, yielding coherent and stable displacements even under unseen conditions. We evaluate our method on challenging deformation scenarios involving forces of different magnitudes, inverse directions, and sparsely sampled training data. Our approach demonstrates superior generalization while maintaining compact reduced spaces, and supports real-time interactive applications.
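The abstract's core building block, an input-convex decoder, can be sketched as follows. This is a minimal PyTorch illustration of the generic ICNN formulation, not the paper's exact architecture: the `ICNNDecoder` name, layer sizes, and depth are all assumptions. Convexity of each output coordinate in the latent code z comes from two standard ingredients: non-negative weights on the hidden-to-hidden path and convex, non-decreasing activations (softplus).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ICNNDecoder(nn.Module):
    """Sketch of an input-convex decoder: every output coordinate is a
    convex function of the latent code z. Convexity is preserved because
    hidden-to-hidden weights are mapped through softplus (hence
    non-negative) and the activation is convex and non-decreasing."""

    def __init__(self, latent_dim, hidden_dim, out_dim, depth=2):
        super().__init__()
        # Unconstrained "passthrough" connections from z to every layer.
        self.Wx = nn.ModuleList(
            [nn.Linear(latent_dim, hidden_dim) for _ in range(depth)]
            + [nn.Linear(latent_dim, out_dim)]
        )
        # Hidden-to-hidden connections whose weights are kept non-negative.
        self.Wz = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim, bias=False) for _ in range(depth - 1)]
            + [nn.Linear(hidden_dim, out_dim, bias=False)]
        )

    def forward(self, z):
        h = F.softplus(self.Wx[0](z))
        for wx, wz in zip(self.Wx[1:-1], self.Wz[:-1]):
            # Non-negative mixing of the previous (convex) hidden state.
            h = F.softplus(wx(z) + F.linear(h, F.softplus(wz.weight)))
        # Final layer: affine in z plus a non-negative combination of
        # convex hidden units, which is itself convex in z.
        return self.Wx[-1](z) + F.linear(h, F.softplus(self.Wz[-1].weight))
```

A quick numerical check of the convexity property: for any two latent codes, the decoder evaluated at their midpoint lies below the average of its endpoint values, coordinate-wise.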
Problem

Research questions and friction points this paper is trying to address.

Bridging linear and nonlinear model reduction for deformable object simulation
Ensuring generalization of neural networks beyond training distribution
Maintaining physical consistency and stability in real-time deformation scenarios
Innovation

Methods, ideas, or system contributions that make the work stand out.

Convex-inspired neural network for nonlinear model reduction
Symmetry constraints ensure physical consistency in decoder
Compact reduced spaces enable real-time interactive simulation
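The exact form of the symmetry constraint is not detailed in this summary; one plausible, purely hypothetical realization is to antisymmetrize a base decoder so that negating the latent code (e.g. an inverted applied force) exactly negates the predicted displacement. The names `base` and `symmetric_decode` are illustrative, not the paper's API, and the base network here is a plain MLP rather than the paper's architecture.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# Hypothetical base decoder: latent code (dim 4) -> displacement (dim 3).
base = nn.Sequential(nn.Linear(4, 32), nn.Softplus(), nn.Linear(32, 3))

def symmetric_decode(z):
    # Antisymmetrized decoder: d(-z) = -d(z) holds by construction,
    # so flipping a latent force direction flips the displacement exactly.
    return base(z) - base(-z)

z = torch.randn(8, 4)
assert torch.allclose(symmetric_decode(-z), -symmetric_decode(z), atol=1e-6)
```

This kind of hard constraint bakes the symmetry into the decoder itself rather than hoping it emerges from training data, which is one way such a structure could aid generalization to inverse force directions.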