RETRO: REthinking Tactile Representation Learning with Material PriOrs

📅 2025-05-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing tactile representation learning methods primarily focus on tactile–visual or tactile–text alignment, overlooking the fundamental influence of surface material properties on tactile perception—leading to models lacking physical consistency and cross-material generalizability. This work introduces, for the first time, a systematic integration of material physics priors into tactile representation learning. We propose a three-stage framework: (i) material prior embedding, (ii) multimodal contrastive learning, and (iii) joint tactile–material representation learning. Our approach significantly enhances disentangled modeling of surface texture and intrinsic material properties. It achieves state-of-the-art performance across tactile recognition, material classification, and robot closed-loop manipulation tasks. Extensive evaluation on real robotic platforms demonstrates strong generalization and robustness across diverse materials and contact conditions. This work establishes a new paradigm for physics-grounded perceptual modeling in embodied intelligence.
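Stage (ii) of the framework aligns tactile embeddings with material-prior embeddings via multimodal contrastive learning. The paper does not release code, so the following is only a minimal NumPy sketch of a symmetric InfoNCE-style objective under assumed names (`info_nce_loss`, `tactile_emb`, `material_emb` are all hypothetical, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_normalize(x, eps=1e-8):
    # project embeddings onto the unit sphere before computing similarities
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + eps)

def info_nce_loss(tactile_emb, material_emb, temperature=0.07):
    """Symmetric InfoNCE loss aligning tactile and material-prior embeddings.

    Hypothetical sketch: each row i of the two matrices is assumed to be a
    matched (tactile sample, material prior) pair; all other rows act as
    in-batch negatives.
    """
    t = l2_normalize(tactile_emb)
    m = l2_normalize(material_emb)
    logits = t @ m.T / temperature          # (N, N) cosine-similarity matrix
    labels = np.arange(len(logits))         # matched pairs lie on the diagonal

    def xent(lg):
        # numerically stable cross-entropy with diagonal targets
        lg = lg - lg.max(axis=1, keepdims=True)
        log_probs = lg - np.log(np.exp(lg).sum(axis=1, keepdims=True))
        return -log_probs[labels, labels].mean()

    # average the tactile->material and material->tactile directions
    return 0.5 * (xent(logits) + xent(logits.T))

# toy batch: 4 tactile samples; material priors nearly aligned with their pairs
tactile = rng.normal(size=(4, 16))
material = tactile + 0.1 * rng.normal(size=(4, 16))
loss_aligned = info_nce_loss(tactile, material)
loss_random = info_nce_loss(tactile, rng.normal(size=(4, 16)))
print(loss_aligned < loss_random)  # aligned pairs should yield the lower loss
```

In an actual implementation the material-prior embeddings would come from the stage-(i) material prior encoder rather than from random toy vectors.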

📝 Abstract
Tactile perception is profoundly influenced by the surface properties of objects in contact. However, despite their crucial role in shaping tactile experiences, these material characteristics have been largely neglected in existing tactile representation learning methods. Most approaches primarily focus on aligning tactile data with visual or textual information, overlooking the richness of tactile feedback that comes from understanding the materials' inherent properties. In this work, we address this gap by revisiting the tactile representation learning framework and incorporating material-aware priors into the learning process. These priors, which represent pre-learned characteristics specific to different materials, allow tactile models to better capture and generalize the nuances of surface texture. Our method enables more accurate, contextually rich tactile feedback across diverse materials and textures, improving performance in real-world applications such as robotics, haptic feedback systems, and material editing.
Problem

Research questions and friction points this paper is trying to address.

Neglect of material characteristics in tactile representation learning
Overlooking tactile feedback from materials' inherent properties
Need for material-aware priors in tactile models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Incorporates material-aware priors into learning
Enhances tactile feedback with material properties
Improves accuracy across diverse textures
Weihao Xia
University College London
Chenliang Zhou
University of Cambridge
machine learning, generative artificial intelligence, computer vision, computer graphics
Cengiz Oztireli
University of Cambridge