🤖 AI Summary
Problem: Existing surgical role modeling approaches neglect individual characteristics, such as kinematic patterns and anthropometric traits, as well as team familiarity, and lack robustness for longitudinal personnel tracking across clinical sites.
Method: We propose a generalizable, transferable re-identification framework for surgical staff, the first to jointly model 3D point-cloud-based static morphology and dynamic joint motion—enabling markerless, cross-operating-room long-term tracking. Our method integrates temporal pose representation learning, cross-domain adaptive feature alignment, and multi-view consistency constraints, transcending conventional role-level abstraction.
Results: Evaluated on real-world clinical data, our framework achieves 86.19% identification accuracy and 75.27% cross-environment transfer accuracy—surpassing state-of-the-art by 12%—while improving tracking precision by over 50%. It enables fine-grained, spatiotemporal visualization of surgical team dynamics and operating room utilization.
📝 Abstract
Surgical domain models improve workflow optimization through automated predictions of each staff member's surgical role. However, mounting evidence indicates that team familiarity and individuality impact surgical outcomes. We present a novel staff-centric modeling approach that characterizes individual team members through their distinctive movement patterns and physical characteristics, enabling long-term tracking and analysis of surgical personnel across multiple procedures. To address the challenge of inter-clinic variability, we develop a generalizable re-identification framework that encodes sequences of 3D point clouds to capture shape and articulated motion patterns unique to each individual. Our method achieves 86.19% accuracy on realistic clinical data while maintaining 75.27% accuracy when transferring between different environments, a 12% improvement over existing methods. When used to augment markerless personnel tracking, our approach improves accuracy by over 50%. Through extensive validation across three datasets and the introduction of a novel workflow visualization technique, we demonstrate how our framework can reveal new insights into surgical team dynamics and space utilization patterns, advancing methods to analyze surgical workflows and team coordination.
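The core idea of re-identifying a person from a sequence of 3D point clouds, combining a static shape cue with a dynamic motion cue, can be illustrated with a toy sketch. This is not the paper's learned architecture; it substitutes hand-crafted statistics (per-frame point spread as a stand-in for body morphology, centroid displacements as a stand-in for articulated motion) and cosine-similarity matching, purely to make the two-cue structure concrete. All function names and parameters here are hypothetical.

```python
# Illustrative sketch only (hand-crafted features, not the paper's learned model).
import numpy as np

def encode_sequence(frames):
    """Embed a sequence of point clouds (list of (N_i, 3) arrays).

    Static cue: average per-frame spread of points around the centroid
    (a crude proxy for body shape / anthropometric traits).
    Dynamic cue: statistics of frame-to-frame centroid displacement
    (a crude proxy for individual movement patterns).
    """
    centroids = np.stack([f.mean(axis=0) for f in frames])   # (T, 3)
    spreads = np.stack([f.std(axis=0) for f in frames])      # (T, 3)
    shape_feat = spreads.mean(axis=0)                        # (3,)
    motion = np.diff(centroids, axis=0)                      # (T-1, 3)
    motion_feat = np.concatenate([motion.mean(axis=0), motion.std(axis=0)])
    emb = np.concatenate([shape_feat, motion_feat])          # (9,)
    return emb / (np.linalg.norm(emb) + 1e-8)                # unit-normalize

def identify(query_frames, gallery):
    """Return the gallery identity whose embedding is most cosine-similar."""
    q = encode_sequence(query_frames)
    return max(gallery, key=lambda name: float(q @ encode_sequence(gallery[name])))
```

As a usage example, two synthetic "staff members" who differ in body spread and step length can be told apart by matching a fresh sequence against the gallery; a real system would replace the hand-crafted encoder with the learned point-cloud network.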