ETCH: Generalizing Body Fitting to Clothed Humans via Equivariant Tightness

📅 2025-03-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Body fitting from point clouds of humans wearing loose clothing remains challenging: traditional optimization methods rely heavily on pose initialization, while learning-based approaches generalize poorly. Method: We propose an SE(3)-equivariant tightness modeling framework. Its core idea is to encode cloth-to-body displacements as tightness vectors whose directions are locally SE(3)-equivariant, thereby reducing complex cloth-to-body fitting to a sparse inner-body marker regression task. The pipeline combines locally SE(3)-equivariant displacement encoding, tightness-driven surface mapping, and pose-invariant feature extraction. Results: On CAPE and 4D-Dress, our method reduces body fitting error by 16.7%–69.5% and improves shape accuracy by 49.9% on average. In out-of-distribution (OOD) settings, including unseen poses, body shapes, and non-rigid dynamics, directional error decreases by 67.2%–89.8%, demonstrating substantially stronger generalization.

📝 Abstract
Fitting a body to a 3D clothed human point cloud is a common yet challenging task. Traditional optimization-based approaches use multi-stage pipelines that are sensitive to pose initialization, while recent learning-based methods often struggle with generalization across diverse poses and garment types. We propose Equivariant Tightness Fitting for Clothed Humans, or ETCH, a novel pipeline that estimates cloth-to-body surface mapping through locally approximate SE(3) equivariance, encoding tightness as displacement vectors from the cloth surface to the underlying body. Following this mapping, pose-invariant body features regress sparse body markers, simplifying clothed human fitting into an inner-body marker fitting task. Extensive experiments on CAPE and 4D-Dress show that ETCH significantly outperforms state-of-the-art methods, both tightness-agnostic and tightness-aware, in body fitting accuracy on loose clothing (16.7%–69.5%) and shape accuracy (49.9% on average). Our equivariant tightness design can even reduce directional errors by 67.2%–89.8% in one-shot (or out-of-distribution) settings. Qualitative results demonstrate strong generalization of ETCH, regardless of challenging poses, unseen shapes, loose clothing, and non-rigid dynamics. We will release the code and models soon for research purposes at https://boqian-li.github.io/ETCH/.
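The mapping the abstract describes (cloth point plus tightness displacement gives an inner-body point, which is then aggregated into sparse markers) can be sketched in a few lines. The following is a minimal NumPy illustration under assumed inputs, namely network-predicted displacement directions, magnitudes, per-point marker labels, and confidences; the function name and the confidence-weighted aggregation are illustrative, not ETCH's actual implementation:

```python
import numpy as np

def fit_markers_from_tightness(cloth_pts, directions, magnitudes,
                               confidences, labels, num_markers):
    """Map cloth-surface points to inner-body points via tightness
    vectors, then aggregate them into sparse body markers.

    Illustrative sketch only: in ETCH the direction, magnitude, label,
    and confidence fields are predicted by equivariant/invariant networks.
    """
    # Normalize predicted directions so the magnitude alone carries scale.
    directions = directions / np.linalg.norm(directions, axis=1, keepdims=True)
    # The displacement points from the cloth surface toward the body.
    inner_pts = cloth_pts + directions * magnitudes[:, None]
    # Confidence-weighted mean of inner points assigned to each marker.
    markers = np.zeros((num_markers, 3))
    for m in range(num_markers):
        mask = labels == m
        if mask.any():
            w = confidences[mask][:, None]
            markers[m] = (inner_pts[mask] * w).sum(axis=0) / w.sum()
    return markers

# Toy example: two cloth points, each assigned to its own marker.
pts = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
dirs = np.array([[-2.0, 0.0, 0.0], [0.0, -3.0, 0.0]])  # unnormalized
mags = np.array([0.1, 0.2])
markers = fit_markers_from_tightness(pts, dirs, mags,
                                     np.array([1.0, 1.0]),
                                     np.array([0, 1]), num_markers=2)
# markers → [[0.9, 0, 0], [0, 0.8, 0]]
```

Because each marker is a weighted average over many inner points, errors in individual displacement predictions tend to cancel, which is one reason reducing the fit to sparse markers is more robust than dense cloth-to-body correspondence.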
Problem

Research questions and friction points this paper is trying to address.

Challenges in fitting 3D clothed human point clouds
Generalization issues across poses and garment types
Improving accuracy in body fitting for loose clothing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Equivariant Tightness Fitting for Clothed Humans
Locally approximate SE(3)-equivariant cloth-to-body mapping
Pose-invariant body feature regression of sparse markers