CLAMP: Crowdsourcing a LArge-scale in-the-wild haptic dataset with an open-source device for Multimodal robot Perception

📅 2025-05-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
Addresses the challenge of object material recognition for robots in unstructured environments: proposes CLAMP, a low-cost open-source haptic device, to collect a large-scale haptic dataset; trains a multimodal haptic encoder for material recognition; and validates its effectiveness in robot tasks such as sorting.

📝 Abstract
Robust robot manipulation in unstructured environments often requires understanding object properties that extend beyond geometry, such as material or compliance, properties that can be challenging to infer using vision alone. Multimodal haptic sensing provides a promising avenue for inferring such properties, yet progress has been constrained by the lack of large, diverse, and realistic haptic datasets. In this work, we introduce the CLAMP device, a low-cost (<$200) sensorized reacher-grabber designed to collect large-scale, in-the-wild multimodal haptic data from non-expert users in everyday settings. We deployed 16 CLAMP devices to 41 participants, resulting in the CLAMP dataset, the largest open-source multimodal haptic dataset to date, comprising 12.3 million datapoints across 5357 household objects. Using this dataset, we train a haptic encoder that can infer material and compliance object properties from multimodal haptic data. We leverage this encoder to create the CLAMP model, a visuo-haptic perception model for material recognition that generalizes to novel objects and three robot embodiments with minimal finetuning. We also demonstrate the effectiveness of our model in three real-world robot manipulation tasks: sorting recyclable and non-recyclable waste, retrieving objects from a cluttered bag, and distinguishing overripe from ripe bananas. Our results show that large-scale, in-the-wild haptic data collection can unlock new capabilities for generalizable robot manipulation. Website: https://emprise.cs.cornell.edu/clamp/
Problem

Research questions and friction points this paper is trying to address.

Lack of large, diverse, and realistic haptic datasets for robots
Need for a low-cost device to collect in-the-wild haptic data
Difficulty of inferring material and compliance properties from vision alone
Innovation

Methods, ideas, or system contributions that make the work stand out.

Low-cost haptic sensor for data collection
Large-scale multimodal haptic dataset creation
Visuo-haptic model for material recognition
Pranav N. Thakkar
Cornell University
Shubhangi Sinha
Cornell University
Karan Baijal
Cornell University
Yuhan Bian
Cornell University
Leah Lackey
Cornell University
Ben Dodson
Cornell University
Heisen Kong
Cornell University
Jueun Kwon
PhD student, Northwestern University
robotics, embodied learning, active learning, control
Amber Li
Cornell University
Yifei Hu
Cornell University
Alexios Rekoutis
Horace Mann School
Tom Silver
Assistant Professor at Princeton
Planning, Learning, Robotics
Tapomayukh Bhattacharjee
Assistant Professor @ Cornell CS
Human-Robot Interaction, Haptic Perception, Robot Manipulation, Assistive Robotics