VersaPants: A Loose-Fitting Textile Capacitive Sensing System for Lower-Body Motion Capture

📅 2025-11-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of accurate, comfortable, and privacy-preserving lower-limb motion capture under loose-fitting clothing. We propose a non-contact textile capacitive sensing system: conductive fabric electrodes are integrated into relaxed-fit trousers, coupled with a low-power signal acquisition unit and a lightweight Transformer model to enable end-to-end, real-time mapping from capacitance measurements to joint angles. To our knowledge, this is the first capacitive textile-based lower-limb motion capture system that operates without subject-specific calibration while ensuring privacy (no cameras or biosignals), comfort (non-restrictive wear), and strong cross-subject generalizability. Evaluated on an 11-subject dataset, the system achieves mean joint position error of 11.96 cm and joint angle error of 12.3°. The model reduces parameter count by 22× versus baseline architectures and runs at 42 FPS—enabling deployment on resource-constrained edge devices such as smartwatches.

📝 Abstract
We present VersaPants, the first loose-fitting, textile-based capacitive sensing system for lower-body motion capture, built on the open-hardware VersaSens platform. By integrating conductive textile patches and a compact acquisition unit into a pair of pants, the system reconstructs lower-body pose without compromising comfort. Unlike IMU-based systems that require user-specific fitting or camera-based methods that compromise privacy, our approach operates without fitting adjustments and preserves user privacy. VersaPants is a custom-designed smart garment featuring 6 capacitive channels per leg. We employ a lightweight Transformer-based deep learning model that maps capacitance signals to joint angles, enabling embedded implementation on edge platforms. To test our system, we collected approximately 3.7 hours of motion data from 11 participants performing 16 daily and exercise-based movements. The model achieves a mean per-joint position error (MPJPE) of 11.96 cm and a mean per-joint angle error (MPJAE) of 12.3 degrees across the hip, knee, and ankle joints, indicating the model's ability to generalize to unseen users and movements. A comparative analysis of existing textile-based deep learning architectures reveals that our model achieves competitive reconstruction performance with up to 22 times fewer parameters and 18 times fewer FLOPs, enabling real-time inference at 42 FPS on a commercial smartwatch without quantization. These results position VersaPants as a promising step toward scalable, comfortable, and embedded motion-capture solutions for fitness, healthcare, and wellbeing applications.
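The MPJPE and MPJAE figures quoted above are standard pose-estimation metrics. As a minimal sketch of how they are typically computed (the array shapes are illustrative, not taken from the paper):

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error: average Euclidean distance
    between predicted and ground-truth 3D joint positions.
    pred, gt: arrays of shape (frames, joints, 3), in cm."""
    return np.linalg.norm(pred - gt, axis=-1).mean()

def mpjae(pred_deg, gt_deg):
    """Mean per-joint angle error: average absolute difference
    between predicted and ground-truth joint angles.
    pred_deg, gt_deg: arrays of shape (frames, joints), in degrees."""
    return np.abs(pred_deg - gt_deg).mean()
```

Both metrics average over all frames and joints, so a single number summarizes performance across the hip, knee, and ankle joints as reported.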
Problem

Research questions and friction points this paper is trying to address.

Developing loose-fitting textile pants for lower-body motion capture
Reconstructing body pose without compromising comfort or user privacy
Enabling real-time embedded motion capture with minimal computational resources
Innovation

Methods, ideas, or system contributions that make the work stand out.

Textile capacitive sensing system captures lower-body motion
Transformer model maps capacitance signals to joint angles
Lightweight model enables embedded real-time inference on edge
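As a rough illustration of the last two points, a single self-attention block regressing joint angles from a window of 12-channel capacitance frames (6 channels per leg, as in the garment) could look like the sketch below. The window length, embedding size, weight values, and mean-pooling head are all assumptions for illustration, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 12 capacitive channels, 32-dim embeddings,
# 6 output joint angles, 50-frame input window.
N_CH, D, N_JOINTS, WIN = 12, 32, 6, 50

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Randomly initialised weights stand in for trained parameters.
W_in  = rng.normal(0, 0.1, (N_CH, D))      # per-frame channel embedding
W_q   = rng.normal(0, 0.1, (D, D))         # query projection
W_k   = rng.normal(0, 0.1, (D, D))         # key projection
W_v   = rng.normal(0, 0.1, (D, D))         # value projection
W_out = rng.normal(0, 0.1, (D, N_JOINTS))  # regression head -> joint angles

def predict_angles(window):
    """window: (WIN, N_CH) capacitance frames -> (N_JOINTS,) angles."""
    x = window @ W_in                      # (WIN, D) token embeddings
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    attn = softmax(q @ k.T / np.sqrt(D))   # self-attention over time
    x = attn @ v                           # (WIN, D) mixed features
    return x.mean(axis=0) @ W_out          # temporal pooling + head

angles = predict_angles(rng.normal(size=(WIN, N_CH)))
```

Keeping the embedding dimension and layer count small is what makes the parameter and FLOP budget compatible with smartwatch-class hardware, which is the design trade-off the paper's comparative analysis quantifies.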
Deniz Kasap
École Polytechnique Fédérale de Lausanne (EPFL), Switzerland
Taraneh Aminosharieh Najafi
École Polytechnique Fédérale de Lausanne (EPFL), Switzerland
Jérôme Paul Rémy Thevenot
École Polytechnique Fédérale de Lausanne (EPFL), Switzerland
Jonathan Dan
Post-Doc, EPFL (EEG, Epilepsy, Signal processing, Wearable EEG)
Stefano Albini
École Polytechnique Fédérale de Lausanne (EPFL), Switzerland
David Atienza
Professor of Electrical and Computer Engineering, EPFL (Embedded systems, Thermal management, HW/SW codesign, Edge AI, Internet of Things)