🤖 AI Summary
This work addresses the performance limitations of first-person hand-object interaction detection caused by the scarcity of real-world annotations. To overcome this challenge, the authors propose a synthetic data generation pipeline designed specifically for hand-object interactions, along with the HOI-Synth benchmark, which automatically provides contact states, bounding boxes, and pixel-level segmentation masks. By systematically aligning synthetic and real data at the levels of object appearance, grasp types, and scene context, and by jointly training with only 10% of the real labeled data, the method achieves significant improvements in Overall AP: +5.67% on VISOR, +8.24% on EgoHOS, and +11.69% on ENIGMA-51. These results demonstrate the critical role of high-quality synthetic data and effective domain alignment in advancing egocentric hand-object interaction understanding.
📝 Abstract
In this work, we explore the role of synthetic data in improving the detection of Hand-Object Interactions (HOI) from egocentric images. Through extensive experimentation and comparative analysis on the VISOR, EgoHOS, and ENIGMA-51 datasets, our findings demonstrate the potential of synthetic data to significantly improve HOI detection, particularly when real labeled data are scarce or unavailable. By using synthetic data together with only 10% of the real labeled data, we achieve improvements in Overall AP over models trained exclusively on real data, with gains of +5.67% on VISOR, +8.24% on EgoHOS, and +11.69% on ENIGMA-51. Furthermore, we systematically study the effect of aligning synthetic data to specific real-world benchmarks with respect to objects, grasps, and environments, showing that the effectiveness of synthetic data consistently improves with better synthetic-real alignment. As a result of this work, we release a new data generation pipeline and the new HOI-Synth benchmark, which augments existing datasets with synthetic images of hand-object interactions. These data are automatically annotated with hand-object contact states, bounding boxes, and pixel-wise segmentation masks. All data, code, and tools for synthetic data generation are available at: https://fpv-iplab.github.io/HOI-Synth/.