A Multi-Layer Sim-to-Real Framework for Gaze-Driven Assistive Neck Exoskeletons

πŸ“… 2026-03-06
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This study addresses the loss of voluntary head control in patients with dropped head syndrome, caused by neck muscle weakness, by proposing a personalized assistive control method that predicts head motion from eye movements. Leveraging eye–head coupling data collected from healthy subjects in a virtual reality (VR) environment, the authors develop an eye-movement-driven head pose prediction model and establish a multi-stage sim-to-real controller selection framework spanning simulation, VR, and a physical neck exoskeleton. The framework rejects poor-performing controllers early and identifies two novel gaze-driven architectures that perform strongly when deployed on the physical exoskeleton. The results show that no single controller universally outperforms the others, underscoring both the necessity and the feasibility of personalized control strategies for assistive head support.
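The multi-stage selection idea described above can be sketched as a simple cascade: candidate controllers are scored at increasingly realistic (and increasingly expensive) stages, and low performers are culled before reaching hardware. This is a minimal toy illustration; the stage names, controller names, scores, and keep-fraction heuristic are all assumptions, not the paper's actual evaluation protocol or metrics.

```python
# Hypothetical sketch of a multi-stage (sim -> VR -> hardware) controller
# selection cascade. All candidates, scores, and the keep_fraction rule
# are illustrative assumptions.

def select_controllers(candidates, stages, keep_fraction=0.5):
    """Filter candidates through scoring stages; higher score = better.

    stages: list of (stage_name, score_fn) pairs, ordered cheap to expensive.
    Keeps the top keep_fraction of survivors at each stage (at least one).
    """
    survivors = list(candidates)
    for _name, score_fn in stages:
        ranked = sorted(survivors, key=score_fn, reverse=True)
        n_keep = max(1, int(len(ranked) * keep_fraction))
        survivors = ranked[:n_keep]
    return survivors

# Toy candidates with fixed per-stage scores (purely made up).
scores = {
    "pid":    {"sim": 0.6, "vr": 0.5, "hw": 0.4},
    "lstm":   {"sim": 0.9, "vr": 0.8, "hw": 0.7},
    "tcn":    {"sim": 0.8, "vr": 0.9, "hw": 0.8},
    "random": {"sim": 0.2, "vr": 0.1, "hw": 0.1},
}
stages = [(s, lambda c, s=s: scores[c][s]) for s in ("sim", "vr", "hw")]
finalists = select_controllers(scores, stages)
print(finalists)  # the single surviving controller after all three stages
```

The key property the paper exploits is that cheap early stages (simulation) prune the search space, so only a handful of controllers ever need costly physical-exoskeleton trials.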

πŸ“ Abstract
Dropped head syndrome, caused by neck muscle weakness from neurological diseases, severely impairs an individual's ability to support and move their head, causing pain and making everyday tasks challenging. Our long-term goal is to develop an assistive powered neck exoskeleton that restores natural movement. However, predicting a user's intended head movement remains a key challenge. We leverage virtual reality (VR) to collect coupled eye and head movement data from healthy individuals to train models capable of predicting head movement based solely on eye gaze. We also propose a novel multi-layer controller selection framework, where head control strategies are evaluated across decreasing levels of abstraction -- from simulation and VR to a physical neck exoskeleton. This pipeline effectively rejects poor-performing controllers early, identifying two novel gaze-driven models that achieve strong performance when deployed on the physical exoskeleton. Our results reveal that no single controller is universally preferred, highlighting the necessity for personalization in gaze-driven assistive control. Our work demonstrates the utility of VR-based evaluation for accelerating the development of intuitive, safe, and personalized assistive robots.
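To make the core idea concrete, here is a deliberately minimal sketch of predicting head pose from eye gaze: a least-squares linear map fit on synthetic eye–head coupling data. The paper's actual models are learned architectures trained on VR-collected data; the linear fit, the synthetic coupling (head pose as attenuated gaze plus noise), and all numbers below are assumptions for illustration only.

```python
import numpy as np

def fit_gaze_to_head(gaze, head):
    """Fit head ~= [gaze, 1] @ W via least squares.

    gaze: (T, 2) gaze yaw/pitch in degrees; head: (T, 2) head yaw/pitch.
    Stands in for the paper's learned eye-to-head prediction models.
    """
    X = np.hstack([gaze, np.ones((gaze.shape[0], 1))])  # append bias column
    W, *_ = np.linalg.lstsq(X, head, rcond=None)
    return W

def predict_head(gaze, W):
    X = np.hstack([gaze, np.ones((gaze.shape[0], 1))])
    return X @ W

# Synthetic eye-head coupling: head follows gaze with attenuation + noise
# (a toy assumption, not the collected VR dataset).
rng = np.random.default_rng(0)
gaze = rng.uniform(-30.0, 30.0, size=(200, 2))          # gaze angles (deg)
head = 0.6 * gaze + rng.normal(0.0, 0.5, size=(200, 2))  # coupled head pose

W = fit_gaze_to_head(gaze[:150], head[:150])   # train split
pred = predict_head(gaze[150:], W)             # held-out predictions
rmse = float(np.sqrt(np.mean((pred - head[150:]) ** 2)))
print(f"held-out RMSE: {rmse:.2f} deg")
```

A controller built on such a predictor would drive the exoskeleton toward the predicted head pose, which is why prediction quality is evaluated before any hardware deployment in the paper's pipeline.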
Problem

Research questions and friction points this paper is trying to address.

gaze-driven control
neck exoskeleton
intention prediction
assistive robotics
dropped head syndrome
Innovation

Methods, ideas, or system contributions that make the work stand out.

gaze-driven control
sim-to-real transfer
neck exoskeleton
virtual reality
personalized assistive robotics