Lang2Morph: Language-Driven Morphological Design of Robotic Hands

📅 2025-09-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
Problem: Dexterous hand morphology design relies heavily on expert intuition, incurs high computational costs in simulation-based optimization, and lacks semantic understanding of manipulation tasks.
Method: This paper introduces the first zero-shot, large language model (LLM)-driven framework for dexterous hand structural synthesis. It directly parses natural-language task descriptions, generates symbolic hand structures, and maps them to Open-Source Prosthetic Hand (OPH)-compatible parameters. The framework jointly optimizes semantic alignment and dimensional compatibility, and supports LLM-guided iterative refinement, eliminating the need for simulation or manual parameter tuning.
Results: Experiments demonstrate that the framework automatically generates 3D-printable, functionally appropriate, and morphologically diverse dexterous hands across multiple manipulation tasks. It significantly improves design efficiency and generalization, enabling end-to-end mapping from task semantics to manufacturable structures without human intervention.

📝 Abstract
Designing robotic hand morphologies for diverse manipulation tasks requires balancing dexterity, manufacturability, and task-specific functionality. While open-source frameworks and parametric tools support reproducible design, they still rely on expert heuristics and manual tuning. Automated methods using optimization are often compute-intensive, simulation-dependent, and rarely target dexterous hands. Large language models (LLMs), with their broad knowledge of human-object interactions and strong generative capabilities, offer a promising alternative for zero-shot design reasoning. In this paper, we present Lang2Morph, a language-driven pipeline for robotic hand design. It uses LLMs to translate natural-language task descriptions into symbolic structures and OPH-compatible parameters, enabling 3D-printable task-specific morphologies. The pipeline consists of: (i) Morphology Design, which maps tasks into semantic tags, structural grammars, and OPH-compatible parameters; and (ii) Selection and Refinement, which evaluates design candidates based on semantic alignment and size compatibility, and optionally applies LLM-guided refinement when needed. We evaluate Lang2Morph across varied tasks, and results show that our approach can generate diverse, task-relevant morphologies. To our knowledge, this is the first attempt to develop an LLM-based framework for task-conditioned robotic hand design.
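The abstract's two-stage pipeline can be sketched in code. This is a hedged, rule-based stand-in, not the authors' implementation: every class, function, tag vocabulary, and parameter range below is a hypothetical placeholder, and the lookup logic in `design_morphology` substitutes for the LLM prompting the real system uses.

```python
from dataclasses import dataclass

@dataclass
class HandMorphology:
    task: str
    semantic_tags: list   # e.g. ["grasp", "twist"] (hypothetical vocabulary)
    structure: str        # symbolic structural grammar string
    oph_params: dict      # OPH-compatible dimensional parameters

def design_morphology(task: str) -> HandMorphology:
    """Stage (i), Morphology Design: map a task description to semantic
    tags, a structural grammar, and OPH-compatible parameters.
    A real implementation would prompt an LLM; a keyword lookup stands in."""
    tags = [w for w in ("pinch", "grasp", "twist") if w in task.lower()]
    n_fingers = 2 + len(tags)  # toy rule: richer tasks get more fingers
    structure = "palm(" + ",".join(f"finger{i}(prox,dist)" for i in range(n_fingers)) + ")"
    params = {"finger_count": n_fingers, "finger_length_mm": 70.0}
    return HandMorphology(task, tags, structure, params)

def select_and_refine(candidates):
    """Stage (ii), Selection and Refinement: score each candidate on
    semantic alignment and size compatibility, keep the best."""
    def score(m):
        alignment = len(m.semantic_tags)  # proxy for semantic fit
        size_ok = 40.0 <= m.oph_params["finger_length_mm"] <= 120.0
        return alignment + (1 if size_ok else 0)
    return max(candidates, key=score)

best = select_and_refine([design_morphology("pinch a coin"),
                          design_morphology("grasp and twist a jar lid")])
print(best.structure)
```

The point of the sketch is the data flow, one structured morphology object per candidate passing from generation to scoring, not the toy heuristics inside either stage.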
Problem

Research questions and friction points this paper is trying to address.

Automating robotic hand design for diverse manipulation tasks
Reducing reliance on expert heuristics and manual tuning
Overcoming compute-intensive optimization methods for dexterous hands
Innovation

Methods, ideas, or system contributions that make the work stand out.

LLMs translate task descriptions into symbolic structures
Generates 3D-printable parameters for task-specific morphologies
Selection scores candidates on semantic alignment and size compatibility, with optional LLM-guided refinement
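The refinement step in the bullets above is an evaluate-and-revise loop. A minimal sketch, assuming a hypothetical `refine` interface: here a clamping rule stands in for the LLM that would propose a revised design, and the printable-length range is an invented example constraint.

```python
def refine(design, evaluate, propose_fix, max_iters=3):
    """Iterative refinement sketch: check a candidate against the
    evaluator; on failure, request a revision (an LLM call in the real
    system, a deterministic rule here)."""
    for _ in range(max_iters):
        ok, feedback = evaluate(design)
        if ok:
            return design
        design = propose_fix(design, feedback)
    return design

# Toy size-compatibility check: finger length must lie in [40, 120] mm.
evaluate = lambda d: (40.0 <= d["finger_length_mm"] <= 120.0,
                      "finger length outside printable range")
# Stand-in for an LLM revision: clamp the offending parameter.
propose_fix = lambda d, fb: {**d, "finger_length_mm":
                             min(max(d["finger_length_mm"], 40.0), 120.0)}

fixed = refine({"finger_length_mm": 150.0}, evaluate, propose_fix)
print(fixed)
```

Keeping the evaluator separate from the reviser mirrors the paper's split between scoring (semantic alignment, size compatibility) and generation, so either side can be swapped out independently.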
Yanyuan Qiao
Postdoctoral Research Fellow, EPFL
Embodied AI · Vision and Language · Multi-modal Learning
Kieran Gilday
EPFL
robotics · manipulation · soft robotics
Yutong Xie
Computer Vision Department, Mohamed bin Zayed University of Artificial Intelligence (MBZUAI), Abu Dhabi, UAE
Josie Hughes
CREATE Lab, Swiss Federal Institute of Technology Lausanne (EPFL), Lausanne, Switzerland