🤖 AI Summary
Addressing the challenges of large garment deformations, dynamic human limb motion, visual occlusion, and imprecise force control in robot-assisted dressing, this paper proposes a vision–force multimodal closed-loop control framework. Methodologically, it employs a vision-based policy pretrained in simulation under partial observability, then fine-tuned online in the real world with a small amount of data and multimodal feedback from vision and force sensing, enabling adaptive responses to user arm motion and thereby overcoming the static-limb assumption common in prior work. Evaluated on 264 long-sleeve dressing trials across 12 human participants, the system substantially outperforms baseline approaches in task completion rate and user feedback on comfort and safety. The work integrates a lightweight, multimodal online fine-tuning mechanism into real-world dressing tasks, establishing a robust, low-data paradigm for dynamic human–robot interaction in assistive robotics.
📝 Abstract
Robot-assisted dressing has the potential to significantly improve the lives of individuals with mobility impairments. To ensure an effective and comfortable dressing experience, the robot must be able to handle challenging deformable garments, apply appropriate forces, and adapt to limb movements throughout the dressing process. Prior work often makes simplifying assumptions -- such as static human limbs during dressing -- which limits real-world applicability. In this work, we develop a robot-assisted dressing system capable of handling partial observations with visual occlusions, as well as robustly adapting to arm motions during the dressing process. Given a policy trained in simulation with partial observations, we propose a method to fine-tune it in the real world using a small amount of data and multi-modal feedback from vision and force sensing, to further improve the policy's adaptability to arm motions and enhance safety. We evaluate our method in simulation with simplified articulated human meshes and in a real-world human study with 12 participants across 264 dressing trials. Our policy successfully dresses two long-sleeve everyday garments onto the participants while adapting to various kinds of arm motions, and greatly outperforms prior baselines in terms of task completion and user feedback. Videos are available at https://dressing-motion.github.io/.