LAMP: Data-Efficient Linear Affine Weight-Space Models for Parameter-Controlled 3D Shape Generation and Extrapolation

📅 2025-10-25
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the strong data dependency, poor controllability, and limited generalization in parametric 3D shape generation, this paper proposes a data-efficient, controllable, and interpretable generative framework. Methodologically, it performs linear affine mixing in an aligned neural weight space, integrates SDF decoder overfitting with parameter-constrained optimization, and introduces a linear mismatch safety metric to ensure geometric validity. The approach achieves high-fidelity interpolation and safe extrapolation using only ~100 training samples—enabling, for the first time, full-range (100%) parameter-space extrapolation. Experiments on DrivAerNet++ and BlendedNet demonstrate substantial improvements over conditional autoencoders and Deep Neural Interpolation (DNI), particularly in data efficiency, precise parametric control, and physics-informed performance optimization.

Technology Category

Application Category

📝 Abstract
Generating high-fidelity 3D geometries that satisfy specific parameter constraints has broad applications in design and engineering. However, current methods typically rely on large training datasets and struggle with controllability and generalization beyond the training distributions. To overcome these limitations, we introduce LAMP (Linear Affine Mixing of Parametric shapes), a data-efficient framework for controllable and interpretable 3D generation. LAMP first aligns signed distance function (SDF) decoders by overfitting each exemplar from a shared initialization, then synthesizes new geometries by solving a parameter-constrained mixing problem in the aligned weight space. To ensure robustness, we further propose a safety metric that detects geometry validity via linearity mismatch. We evaluate LAMP on two 3D parametric benchmarks: DrivAerNet++ and BlendedNet. We found that LAMP enables (i) controlled interpolation within bounds with as few as 100 samples, (ii) safe extrapolation by up to 100% parameter difference beyond training ranges, (iii) physics performance-guided optimization under fixed parameters. LAMP significantly outperforms conditional autoencoder and Deep Network Interpolation (DNI) baselines in both extrapolation and data efficiency. Our results demonstrate that LAMP advances controllable, data-efficient, and safe 3D generation for design exploration, dataset generation, and performance-driven optimization.
Problem

Research questions and friction points this paper is trying to address.

Generating high-fidelity 3D shapes with parameter constraints
Overcoming data inefficiency and limited generalization in 3D generation
Enabling controlled interpolation and safe extrapolation beyond training ranges
Innovation

Methods, ideas, or system contributions that make the work stand out.

Linear affine mixing in aligned weight space
Data-efficient framework with few samples
Safety metric for geometry validity detection
🔎 Similar Papers
No similar papers found.