PhySkin: Physics-based Bone-driven Neural Garment Simulation

📅 2026-03-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of achieving real-time, physically plausible simulation of loose garments on resource-constrained devices, where traditional physics-based methods are computationally prohibitive and existing neural approaches struggle to balance efficiency with realism. The authors propose a skeleton-driven, lightweight neural simulation framework that maps garment degrees of freedom onto skeletal control handles, enabling fully self-supervised training of a neural deformation model. Requiring only body shape and pose as input, the method generates physically consistent garment deformations without relying on real or synthetic training data. It supports zero-shot generalization, handles arbitrary mesh topologies, and achieves microsecond-level inference on a single-threaded CPU. Experiments demonstrate its ability to produce realistic draping effects across diverse body shapes and poses, substantially outperforming state-of-the-art methods in runtime and enabling real-time virtual character animation on low-power hardware.
📝 Abstract
Recent advances in digital avatar technology have enabled the generation of compelling virtual characters, but deploying these avatars on compute-constrained devices poses significant challenges for achieving realistic garment deformations. While physics-based simulations yield accurate results, they are computationally prohibitive for real-time applications. Conversely, linear blend skinning offers efficiency but fails to capture the complex dynamics of loose-fitting garments, resulting in unrealistic motion and visual artifacts. Neural methods have shown promise, yet they struggle to animate loose clothing plausibly under strict performance constraints. In this work, we present a novel approach for fast and physically plausible garment draping tailored for resource-constrained environments. Our method leverages a reduced-space quasi-static neural simulation, mapping the garment's full degrees of freedom to a set of bone handles that drive deformation. A neural deformation model is trained in a fully self-supervised manner, eliminating the need for costly simulation data. At runtime, a lightweight neural network modulates the handle deformations based on body shape and pose, enabling realistic garment behavior that respects physical properties such as gravity, fabric stretching, bending, and collision avoidance. Experimental results demonstrate that our method achieves physically plausible garment drapes while generalizing across diverse poses and body shapes, supporting zero-shot evaluation and mesh topology independence. Our method's runtime significantly outperforms past works, as it runs in microseconds per frame using single-threaded CPU inference, offering a practical solution for real-time avatar animation on low-compute devices.
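The paper's implementation is not reproduced here, but the abstract names two concrete ingredients: skinning the garment from a reduced set of bone-handle transforms, and training self-supervisedly against physics energies (gravity, stretch, bending, collision). Below is a minimal NumPy sketch of just those two pieces, linear blend skinning onto handle transforms and a simplified loss combining edge-stretch and gravity terms. All function names, loss weights, and the y-up gravity convention are illustrative assumptions, and the bending and collision terms are omitted.

```python
import numpy as np

def skin_vertices(rest_verts, weights, handle_transforms):
    # Linear blend skinning: each garment vertex is a weighted sum of its
    # rest position transformed by each handle's 3x4 affine matrix.
    # rest_verts: (V,3), weights: (V,H) rows summing to 1,
    # handle_transforms: (H,3,4). Shapes are illustrative, not the paper's.
    homo = np.concatenate([rest_verts, np.ones((len(rest_verts), 1))], axis=1)  # (V,4)
    per_handle = np.einsum('hij,vj->hvi', handle_transforms, homo)              # (H,V,3)
    return np.einsum('vh,hvi->vi', weights, per_handle)                         # (V,3)

def self_supervised_loss(verts, rest_verts, edges, mass=1.0, g=9.81,
                         w_stretch=1.0, w_gravity=1e-3):
    # Simplified self-supervised objective (no simulation data needed):
    # 1) stretch energy penalizes edge-length deviation from the rest mesh;
    # 2) gravity potential rewards lower garment height (y-up assumed).
    # Weights w_stretch / w_gravity are arbitrary placeholders.
    rest_len = np.linalg.norm(rest_verts[edges[:, 0]] - rest_verts[edges[:, 1]], axis=1)
    cur_len = np.linalg.norm(verts[edges[:, 0]] - verts[edges[:, 1]], axis=1)
    stretch = np.sum((cur_len - rest_len) ** 2)
    gravity = mass * g * np.sum(verts[:, 1])
    return w_stretch * stretch + w_gravity * gravity
```

In the paper's setting, a small network would predict the handle transforms from body shape and pose, and the loss above (plus bending and collision terms) would be minimized over predicted drapes, so no ground-truth simulation data is required.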
Problem

Research questions and friction points this paper addresses.

garment simulation
real-time animation
physics-based deformation
compute-constrained devices
loose-fitting clothing
Innovation

Methods, ideas, or system contributions that make the work stand out.

physics-based simulation
neural garment animation
bone-driven deformation
self-supervised learning
real-time avatar