🤖 AI Summary
Addressing the challenges of scarce annotated data and severe simulation-to-reality domain shift in structural deformation detection under real-world conditions, this paper proposes a controllable physics-based simulation framework integrated with lightweight domain adaptation. Methodologically: (1) we design a physics-inspired, controllable deformation image generator tailored for deformation classification, enabling fine-grained adjustment of deformation parameters; (2) we introduce a feature-level adversarial domain adaptation network that aligns simulated feature distributions with those of real-world data; (3) the model is fine-tuned efficiently using only a small number of real deformation samples. Experiments demonstrate that our approach improves classification accuracy by 12.6% over a pure-simulation baseline on cross-domain deformation classification tasks, significantly reducing dependence on real annotated data. The source code is publicly available.
📝 Abstract
Deformation detection is vital for accurately assessing and predicting structural changes in materials, enabling timely and effective interventions to maintain safety and integrity. Automating deformation detection through computer vision is crucial for efficient monitoring, but it requires a comprehensive dataset of both deformed and non-deformed objects, which is difficult to obtain in many scenarios. In this paper, we introduce a novel framework for generating controlled synthetic data that simulates deformed objects, allowing realistic modeling of object deformations under various conditions. Our framework integrates an intelligent adapter network that facilitates sim-to-real domain adaptation, improving classification results while requiring only limited real data from deformed objects. We conduct experiments on domain adaptation and classification tasks and demonstrate that our framework improves sim-to-real classification results compared to the simulation baseline. Our code is available here.
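The feature-level adversarial domain adaptation mentioned above is not specified in detail here. As a minimal sketch, assuming a DANN-style design (a gradient-reversal layer feeding a small domain discriminator, which is one common realization and not necessarily the paper's exact architecture; all class and dimension names below are hypothetical), feature alignment between simulated and real domains might look like:

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; negates and scales the gradient
    in the backward pass, so the feature encoder is trained to *fool*
    the domain discriminator (adversarial feature alignment)."""
    @staticmethod
    def forward(ctx, x, lamb):
        ctx.lamb = lamb
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lamb * grad_output, None

class DomainAdversarialHead(nn.Module):
    """Small discriminator on top of encoder features: predicts whether a
    feature vector comes from simulated or real data."""
    def __init__(self, feat_dim=128, lamb=1.0):
        super().__init__()
        self.lamb = lamb
        self.disc = nn.Sequential(
            nn.Linear(feat_dim, 64),
            nn.ReLU(),
            nn.Linear(64, 1),  # logit: 0 = simulated, 1 = real
        )

    def forward(self, features):
        return self.disc(GradReverse.apply(features, self.lamb))

# Toy check: the reversed gradient flows back into the features, which is
# what pushes the encoder toward domain-invariant representations.
feats = torch.randn(8, 128, requires_grad=True)  # mock encoder output
head = DomainAdversarialHead()
loss = nn.functional.binary_cross_entropy_with_logits(
    head(feats), torch.zeros(8, 1))  # pretend this batch is simulated
loss.backward()
```

In a full training loop this domain loss would be summed with the deformation classification loss on labeled (mostly simulated) samples, with `lamb` controlling the strength of the adversarial signal.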