Transparent Fragments Contour Estimation via Visual-Tactile Fusion for Autonomous Reassembly

๐Ÿ“… 2026-03-18
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This work addresses the challenging problem of contour estimation for transparent fragments in autonomous reassembly, which is hindered by their optical complexity and irregular shapes. Inspired by human visuo-tactile perception, the study introduces a novel visualโ€“tactile fusion framework and presents TransFrag27K, the first large-scale dataset of transparent fragments. The proposed method employs TransFragNet to identify optimal grasp points and integrates tactile feedback from a GelSight Mini sensor to capture fine edge details. Fragment matching and reassembly are then achieved through a multidimensional similarity metric. Evaluated in real-world scenarios, the approach demonstrates superior performance, and the release of the dataset and code establishes a reproducible benchmark for transparent fragment reassembly research.

๐Ÿ“ Abstract
Contour estimation of transparent fragments is very important for autonomous reassembly, especially in precision optical instrument repair, cultural relic restoration, and the identification of breakage in other precious devices. Compared with intact transparent objects, contour estimation of transparent fragments faces greater challenges due to complex optical properties and irregular shapes and edges. To address this issue, a general transparent fragment contour estimation framework based on visual-tactile fusion is proposed in this paper. First, we construct a transparent fragment dataset named TransFrag27K, which includes multi-scene synthetic data of broken fragments from multiple types of transparent objects, together with a scalable synthetic data generation pipeline. Second, we propose a visual grasping position detection network named TransFragNet to identify, locate, and segment candidate grasping positions, and we use a two-finger gripper equipped with GelSight Mini sensors to obtain reconstructed tactile information of the lateral edges of the fragments. By fusing this tactile information with visual cues, a visual-tactile fusion material classifier is proposed. Inspired by the way humans estimate a fragment's contour by combining vision and touch, we introduce a general transparent fragment contour estimation framework based on visual-tactile fusion, which demonstrates strong performance in real-world validation. Finally, a contour matching and reassembly algorithm based on multidimensional similarity metrics is proposed, providing a reproducible benchmark for evaluating visual-tactile contour estimation and fragment reassembly. The experimental results demonstrate the validity of the proposed framework. The dataset and code are available at https://github.com/Keithllin/Transparent-Fragments-Contour-Estimation.
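
To make the matching step more concrete, below is a minimal illustrative sketch of how a multidimensional similarity score between two fragment edge contours could be computed and fused. This is not the paper's implementation: the choice of similarity dimensions (edge-length agreement and turning-angle profile), the fusion weights, and all function names are assumptions made for illustration only.

```python
# Illustrative sketch only: a toy multidimensional similarity score for
# matching two fragment edge contours. This is NOT the paper's algorithm;
# the similarity components and weights below are assumptions.
import numpy as np

def resample_contour(points: np.ndarray, n: int = 128) -> np.ndarray:
    """Resample a 2D contour (K x 2) to n points, evenly spaced by arc length."""
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    t = np.linspace(0.0, s[-1], n)
    x = np.interp(t, s, points[:, 0])
    y = np.interp(t, s, points[:, 1])
    return np.stack([x, y], axis=1)

def turning_angles(points: np.ndarray) -> np.ndarray:
    """Signed turning angle at each interior vertex of a resampled contour."""
    v = np.diff(points, axis=0)
    ang = np.arctan2(v[:, 1], v[:, 0])
    return np.diff(np.unwrap(ang))

def similarity(contour_a: np.ndarray, contour_b: np.ndarray) -> float:
    """Fuse several similarity dimensions into one score (roughly in [0, 1])."""
    a = resample_contour(contour_a)
    b = resample_contour(contour_b)

    # Dimension 1: total edge length agreement.
    len_a = np.linalg.norm(np.diff(a, axis=0), axis=1).sum()
    len_b = np.linalg.norm(np.diff(b, axis=0), axis=1).sum()
    s_len = 1.0 - abs(len_a - len_b) / max(len_a, len_b)

    # Dimension 2: turning-angle profile distance; a mating edge is
    # traversed in the opposite direction, hence the reversal and sign flip.
    ta = turning_angles(a)
    tb = -turning_angles(b)[::-1]
    s_ang = 1.0 / (1.0 + np.mean(np.abs(ta - tb)))

    # Weighted fusion of the two dimensions (weights are arbitrary here).
    return 0.4 * s_len + 0.6 * s_ang

if __name__ == "__main__":
    theta = np.linspace(0, np.pi, 200)
    edge_a = np.stack([np.cos(theta), np.sin(theta)], axis=1)   # half circle
    edge_b = edge_a[::-1] + np.array([0.01, -0.01])             # noisy mating edge
    print(f"similarity = {similarity(edge_a, edge_b):.3f}")
```

In a real pipeline, additional dimensions (e.g., curvature histograms or tactile edge profiles from the GelSight reconstruction) could be appended to the fusion in the same way; the two used here simply keep the sketch short.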
Problem

Research questions and friction points this paper is trying to address.

transparent fragments
contour estimation
visual-tactile fusion
autonomous reassembly
irregular shapes
Innovation

Methods, ideas, or system contributions that make the work stand out.

visual-tactile fusion
transparent fragment contour estimation
TransFragNet
GelSight tactile sensing
fragment reassembly
๐Ÿ”Ž Similar Papers
No similar papers found.
Qihao Lin
School of Advanced Manufacturing, Sun Yat-sen University, Shenzhen 518107, China
Borui Chen
School of Advanced Manufacturing, Sun Yat-sen University, Shenzhen 518107, China
Yuping Zhou
School of Advanced Manufacturing, Sun Yat-sen University, Shenzhen 518107, China
Jianing Wu
Sun Yat-sen University
Reliability Engineering · Biophysics · Bio-inspired Systems
Yulan Guo
Professor, Sun Yat-sen University
3D Vision · Machine Learning · Robotics
Weishi Zheng
School of Computer Science and Engineering, Sun Yat-sen University, Guangzhou 510006, China
Chongkun Xia
School of Advanced Manufacturing, Sun Yat-sen University, Shenzhen 518107, China