ProGait: A Multi-Purpose Video Dataset and Benchmark for Transfemoral Prosthesis Users

📅 2025-07-14
🤖 AI Summary
Current vision models exhibit suboptimal performance in gait analysis for above-knee prosthetic users, primarily due to the high visual heterogeneity of prostheses and their distinctive biomechanical motion patterns. To address this gap, we introduce ProGait, the first multi-task video benchmark dataset specifically designed for prosthetic gait analysis, comprising 412 walking videos with fine-grained annotations including human silhouettes, 2D joint poses, and temporal gait phases. Leveraging ProGait, we systematically evaluate the limitations of state-of-the-art vision models and propose a prosthesis-aware fine-tuning strategy that significantly improves generalization across video segmentation, pose estimation, and gait analysis tasks. ProGait fills a critical void in high-quality, task-specific benchmarks for prosthetic vision understanding. The publicly released dataset and codebase support research in personalized prosthesis adaptation and intelligent rehabilitation assessment, thereby advancing reproducible and scalable AI-driven prosthetic research.

📝 Abstract
Prosthetic legs play a pivotal role in clinical rehabilitation, allowing individuals with lower-limb amputations to regain mobility and improve their quality of life. Gait analysis is fundamental for optimizing prosthesis design and alignment, directly impacting the mobility and life quality of individuals with lower-limb amputations. Vision-based machine learning (ML) methods offer a scalable and non-invasive solution to gait analysis, but face challenges in correctly detecting and analyzing prostheses, due to their unique appearance and movement patterns. In this paper, we aim to bridge this gap by introducing a multi-purpose dataset, namely ProGait, to support multiple vision tasks including Video Object Segmentation, 2D Human Pose Estimation, and Gait Analysis (GA). ProGait provides 412 video clips of four above-knee amputees testing multiple newly-fitted prosthetic legs in walking trials, and depicts the presence, contours, poses, and gait patterns of human subjects with transfemoral prosthetic legs. Alongside the dataset itself, we also present benchmark tasks and fine-tuned baseline models to illustrate the practical application and performance of the ProGait dataset. We compared our baseline models against pre-trained vision models, demonstrating improved generalizability when applying the ProGait dataset to prosthesis-specific tasks. Our code is available at https://github.com/pittisl/ProGait and dataset at https://huggingface.co/datasets/ericyxy98/ProGait.
Problem

Research questions and friction points this paper is trying to address.

Lack of datasets for gait analysis of transfemoral prosthesis users
Challenges in vision-based detection of prosthetic legs
Need for improved ML models for prosthesis-specific tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Introduces ProGait dataset for prosthesis gait analysis
Uses vision-based ML for non-invasive gait assessment
Provides benchmark tasks and fine-tuned baseline models
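One of the benchmark tasks above is temporal gait-phase recognition. A minimal, hypothetical sketch of how frame-level phase predictions might be scored against annotations — the phase names and the accuracy metric here are illustrative assumptions, not the paper's actual label schema or evaluation protocol:

```python
# Hypothetical sketch: frame-level accuracy for temporal gait-phase labels.
# Phase names ("stance"/"swing") and the metric are illustrative only.

def phase_accuracy(pred, gold):
    """Fraction of frames whose predicted gait phase matches the annotation."""
    if len(pred) != len(gold):
        raise ValueError("prediction and label sequences must align per frame")
    correct = sum(p == g for p, g in zip(pred, gold))
    return correct / len(gold)

# Toy per-frame phase sequences over one short walking clip.
gold = ["stance", "stance", "swing", "swing", "stance", "swing"]
pred = ["stance", "swing",  "swing", "swing", "stance", "swing"]

print(round(phase_accuracy(pred, gold), 3))  # → 0.833
```

In practice a benchmark would likely aggregate such per-clip scores across all 412 videos and may use segment-level metrics as well; this sketch only shows the simplest per-frame variant.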
Authors
Xiangyu Yin — University of Pittsburgh
Boyuan Yang — University of Pittsburgh
Weichen Liu — College of Computing and Data Science, Nanyang Technological University
Qiyao Xue — University of Pittsburgh
Abrar Alamri — University of Pittsburgh
Goeran Fiedler — University of Pittsburgh
Wei Gao — University of Pittsburgh