TASC: Task-Aware Shared Control for Teleoperated Manipulation

📅 2025-09-12
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work addresses two key challenges in teleoperation: the difficulty of inferring user task intent and the weak cross-task generalization of assistance. To this end, we propose TASC, a Task-Aware zero-shot Shared Control framework. TASC leverages vision-language models to construct open-vocabulary interaction graphs, enabling task-level intent inference without predefined ontologies or domain-specific knowledge; it further integrates spatial constraint prediction to provide real-time rotational assistance during grasping and object manipulation. Its core contribution is the first realization of zero-shot, generalizable shared control for everyday manipulation tasks, jointly supporting long-horizon intent modeling and motion-assistance generalization across diverse objects and tasks. Extensive evaluations on both simulation and real-robot platforms demonstrate that TASC significantly improves task success rates and execution efficiency while reducing the user's cognitive load and physical operation effort.

πŸ“ Abstract
We present TASC, a Task-Aware Shared Control framework for teleoperated manipulation that infers task-level user intent and provides assistance throughout the task. To support everyday tasks without predefined knowledge, TASC constructs an open-vocabulary interaction graph from visual input to represent functional object relationships, and infers user intent accordingly. A shared control policy then provides rotation assistance during both grasping and object interaction, guided by spatial constraints predicted by a vision-language model. Our method addresses two key challenges in general-purpose, long-horizon shared control: (1) understanding and inferring task-level user intent, and (2) generalizing assistance across diverse objects and tasks. Experiments in both simulation and the real world demonstrate that TASC improves task efficiency and reduces user input effort compared to prior methods. To the best of our knowledge, this is the first shared control framework that supports everyday manipulation tasks with zero-shot generalization. The code that supports our experiments is publicly available at https://github.com/fitz0401/tasc.
Problem

Research questions and friction points this paper is trying to address.

Inferring task-level user intent in teleoperation
Providing assistance across diverse objects and tasks
Enabling zero-shot generalization for manipulation tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Task-aware shared control framework
Open-vocabulary interaction graph construction
Vision-language model guided assistance
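As a rough illustration of the interaction-graph idea, the sketch below (not the paper's implementation; all object names, relations, and the alignment heuristic are invented for this example) builds a small open-vocabulary graph of functional object relations and ranks candidate target objects by how well the operator's motion direction points toward them:

```python
# Illustrative sketch of task-level intent inference over an interaction graph.
# Nodes are detected objects; edges are functional relations that, in TASC,
# a vision-language model would propose (hard-coded here for simplicity).
import math

def build_interaction_graph(objects, relations):
    """Adjacency dict: object -> list of (related object, relation label)."""
    graph = {name: [] for name in objects}
    for src, rel, dst in relations:
        graph[src].append((dst, rel))
    return graph

def infer_intent(graph, positions, hand_pos, hand_dir):
    """Score each object by cosine alignment between the hand's (unit) motion
    direction and the direction from the hand to that object."""
    scores = {}
    for obj in graph:
        dx = positions[obj][0] - hand_pos[0]
        dy = positions[obj][1] - hand_pos[1]
        norm = math.hypot(dx, dy)
        if norm == 0:
            continue  # hand is already at the object
        scores[obj] = (dx * hand_dir[0] + dy * hand_dir[1]) / norm
    return max(scores, key=scores.get), scores

objects = ["mug", "kettle", "drawer"]
relations = [("kettle", "pour-into", "mug"), ("mug", "place-in", "drawer")]
graph = build_interaction_graph(objects, relations)
positions = {"mug": (1.0, 0.0), "kettle": (0.0, 1.0), "drawer": (-1.0, 0.0)}
target, scores = infer_intent(graph, positions, (0.0, 0.0), (1.0, 0.0))
print(target)  # the mug lies along the motion direction -> "mug"
```

A real system would additionally use the graph's relations to extend the inferred intent over a long horizon (e.g. grasp the kettle, then pour into the mug), which this toy example does not model.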
Ze Fu
KU Leuven, Dept. Mechanical Engineering, Research unit Robotics, Automation and Mechatronics, B-3000 Leuven, Belgium
Pinhao Song
KU Leuven | Peking University
Neurobotics · Probabilistic Robotics · Underwater Object Detection
Yutong Hu
KU Leuven, Dept. Mechanical Engineering, Research unit Robotics, Automation and Mechatronics, B-3000 Leuven, Belgium
Renaud Detry
KU Leuven
Robot Learning · Computer Vision · Space Robotics