Intuitive control of supernumerary robotic limbs through a tactile-encoded neural interface

📅 2025-11-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the critical challenge of achieving intuitive, multi-degree-of-freedom brain–computer interface (BCI) control for supernumerary robotic limbs (SRLs) during natural movement. We propose a novel tactile P300–based BCI paradigm: spatially encoded vibrotactile stimuli elicit discriminable P300 responses, decoded in real time using a lightweight machine learning model. Unlike conventional paradigms, our approach requires neither motor imagery nor visual attention, enabling fully parallel, non-interfering control synchronized with natural limb motion. After only three days of training, users reliably achieved 4-DOF command classification with >92% accuracy in dual-task scenarios—simultaneously operating two SRLs to assist bimanual manipulation. Our key contribution is the first demonstration of tactile P300 for high-dimensional SRL control, uniquely balancing robustness, low cognitive load, and seamless compatibility with unconstrained natural movement.
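The core decoding idea in the summary — stimulate several body sites in turn, then identify the attended site as the one whose averaged post-stimulus response shows a P300-like deflection — can be sketched with a toy simulation. This is not the authors' pipeline; the signal model, sampling rate, and scoring window are illustrative assumptions.

```python
# Toy P300 target-site decoder: average epochs per stimulus site and pick
# the site with the largest mean amplitude in a 250-450 ms window.
# All signal parameters below are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
fs = 250                                  # sampling rate (Hz), assumed
epoch = np.arange(int(0.8 * fs)) / fs     # 0-800 ms post-stimulus window

def p300_template(t):
    """Positive peak near 300 ms, a rough stand-in for a P300."""
    return np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))

def simulate_epochs(target_site, n_reps=10, n_sites=4, noise=1.0):
    """One trial: each site stimulated n_reps times; attended site evokes a P300."""
    X = rng.normal(0.0, noise, size=(n_sites, n_reps, epoch.size))
    X[target_site] += p300_template(epoch)      # add P300 to attended site only
    return X

def decode_site(X):
    """Average repetitions per site, score by mean amplitude 250-450 ms."""
    avg = X.mean(axis=1)                        # (n_sites, n_samples)
    win = (epoch >= 0.25) & (epoch <= 0.45)
    return int(np.argmax(avg[:, win].mean(axis=1)))

correct = sum(decode_site(simulate_epochs(t % 4)) == (t % 4) for t in range(40))
print(correct / 40)   # well above the 25% chance level for 4 commands
```

Averaging over repetitions is what makes the scheme robust at low per-epoch SNR: noise shrinks with the square root of the number of repetitions while the evoked response adds coherently.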

📝 Abstract
Brain-computer interfaces (BCIs) promise to extend human movement capabilities by enabling direct neural control of supernumerary effectors, yet integrating augmented commands with multiple degrees of freedom without disrupting natural movement remains a key challenge. Here, we propose a tactile-encoded BCI that leverages sensory afferents through a novel tactile-evoked P300 paradigm, allowing intuitive and reliable decoding of supernumerary motor intentions even when superimposed with voluntary actions. The interface was evaluated in a multi-day experiment comprising a single motor-recognition task to validate baseline BCI performance and a dual-task paradigm to assess potential interference between the BCI and natural human movement. The brain interface achieved real-time and reliable decoding of four supernumerary degrees of freedom, with significant performance improvements after only three days of training. Importantly, after training, performance did not differ significantly between the single- and dual-BCI task conditions, and natural movement remained unimpaired during concurrent supernumerary control. Lastly, the interface was deployed in a movement augmentation task, demonstrating its ability to command two supernumerary robotic arms for functional assistance during bimanual tasks. These results establish a new neural interface paradigm for movement augmentation through stimulation of sensory afferents, expanding motor degrees of freedom without impairing natural movement.
Problem

Research questions and friction points this paper is trying to address.

Enabling intuitive neural control of supernumerary robotic limbs
Integrating augmented commands without disrupting natural movement
Decoding motor intentions reliably during voluntary human actions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Tactile-encoded BCI using sensory afferents
Novel tactile-evoked P300 paradigm for decoding
Real-time control of four robotic-arm degrees of freedom
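The summary calls the decoder a "lightweight machine learning model". A common lightweight choice for P300 epoch classification is a Fisher linear discriminant separating target from non-target epochs; the sketch below shows that idea on synthetic features. This is a hypothetical stand-in, not the paper's actual model, and the feature layout is assumed.

```python
# Hypothetical lightweight P300 classifier: Fisher LDA on epoch features,
# separating target (P300-present) from non-target epochs. Synthetic data;
# the feature dimension and "P300 bump" location are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_feat = 20

def make_epochs(n, target):
    """Synthetic epoch features; target epochs get a bump in mid-window bins."""
    mu = np.zeros(n_feat)
    if target:
        mu[8:12] = 1.0
    return rng.normal(mu, 1.0, size=(n, n_feat))

# Training data: balanced target / non-target epochs
Xt, Xn = make_epochs(200, True), make_epochs(200, False)

# Fisher LDA: w = Sw^{-1} (mu_t - mu_n), with pooled within-class covariance
Sw = np.cov(Xt.T) + np.cov(Xn.T)
w = np.linalg.solve(Sw, Xt.mean(0) - Xn.mean(0))
b = -w @ (Xt.mean(0) + Xn.mean(0)) / 2          # decision threshold at midpoint

# Held-out balanced accuracy
Yt, Yn = make_epochs(100, True), make_epochs(100, False)
acc = ((Yt @ w + b > 0).mean() + (Yn @ w + b <= 0).mean()) / 2
print(round(acc, 2))
```

A closed-form linear model like this trains in milliseconds on a few hundred epochs, which is what makes per-user calibration within a short session plausible.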
Tianyu Jia
Assistant Professor, Peking University
VLSI Design, Computer Architecture
Xingchen Yang
Department of Bioengineering, Imperial College London, London, UK
Ciarán McGeady
Department of Bioengineering, Imperial College London, London, UK
Yifeng Li
Department of Bioengineering, Imperial College London, London, UK
Jinzhi Lin
School of Biomedical Engineering, Tsinghua University, Beijing, China
Kit San Ho
Department of Mechanical Engineering, Tsinghua University, Beijing, China
Feiyu Pan
Department of Mechanical Engineering, Tsinghua University, Beijing, China
Linhong Ji
Department of Mechanical Engineering, Tsinghua University, Beijing, China
Chong Li
School of Clinical Medicine (BTCH), Tsinghua Medicine, Tsinghua University, Beijing, China
Dario Farina
Department of Bioengineering, Imperial College London, London, UK