🤖 AI Summary
This work addresses key challenges in interactive multi-object 3D Gaussian splatting: inaccurate object segmentation, deformation artifacts across materials, and rendering anomalies such as interpenetration and floating. We propose the first physics-aware, multi-material 3D Gaussian simulation framework tailored for interaction. Methodologically: (i) we introduce a fast pixel-to-Gaussian mapping mechanism for precise 3D object segmentation; (ii) we jointly model multi-material physical properties as a differentiable dynamical system; and (iii) we embed physical constraints directly into the Gaussian deformation gradient, explicitly regularizing scaling and rotation to ensure geometric fidelity and visual consistency. On interactive scene reconstruction, our method significantly outperforms state-of-the-art approaches, suppressing rendering artifacts while improving geometric accuracy and visual realism, and establishing a new paradigm for physically grounded, dynamic 3D scene generation.
📝 Abstract
3D Gaussian Splatting has achieved remarkable success in reconstructing both static and dynamic 3D scenes. However, in a scene represented by 3D Gaussian primitives, object interactions suffer from inaccurate 3D segmentation, imprecise deformation across different materials, and severe rendering artifacts. To address these challenges, we introduce PIG: Physically-Based Multi-Material Interaction with 3D Gaussians, a novel approach that combines 3D object segmentation with high-precision simulation of interacting objects. First, our method provides fast and accurate mapping from 2D pixels to 3D Gaussians, enabling precise object-level 3D segmentation. Second, we assign distinct physical properties to the segmented objects in the scene, enabling multi-material coupled interactions. Finally, we embed scale constraints into the deformation gradient, clamping the scaling and rotation of the Gaussian primitives to eliminate artifacts and achieve geometric fidelity and visual consistency. Experimental results demonstrate that our method not only outperforms the state of the art (SOTA) in visual quality, but also opens new directions and pipelines for physically realistic scene generation.
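The idea of constraining the deformation gradient can be sketched as follows. This is an illustrative NumPy example under our own assumptions, not the paper's exact formulation: the deformation gradient `F` is split by SVD into rotation and scaling factors, the singular values (local scaling) are clamped into a chosen range, and the rebuilt `F` is then used to deform a Gaussian's covariance. The function names and the clamp bounds `s_min`/`s_max` are hypothetical.

```python
import numpy as np

def clamp_deformation(F, s_min=0.8, s_max=1.2):
    """Clamp the scaling component of a 3x3 deformation gradient F.

    Sketch: F = U diag(s) V^T via SVD; the singular values s encode local
    stretch/compression. Clamping them into [s_min, s_max] while keeping the
    rotation factors U, V removes the extreme anisotropic scaling that turns
    Gaussian primitives into elongated "spiky" artifacts.
    """
    U, s, Vt = np.linalg.svd(F)
    s_clamped = np.clip(s, s_min, s_max)
    return U @ np.diag(s_clamped) @ Vt

def deform_covariance(cov, F):
    """Push a Gaussian covariance through a (clamped) deformation: F cov F^T."""
    return F @ cov @ F.T

# Example: a deformation with a 3x stretch along one axis is clamped to 1.2x.
F = np.diag([3.0, 1.0, 1.0])
F_safe = clamp_deformation(F)
cov_new = deform_covariance(np.eye(3), F_safe)
```

Because `U` and `V` are orthogonal, only the scaling magnitudes change; the orientation of the deformed Gaussian is preserved, which is what keeps the rendered result visually consistent with the simulated motion.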