Multi-Sensor Fusion-Based Mobile Manipulator Remote Control for Intelligent Smart Home Assistance

📅 2025-04-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the insufficient naturalness and robustness of remote teleoperation for mobile manipulators in smart-home assistive scenarios for elderly and disabled users, this paper proposes a multimodal wearable-sensing–based remote collaborative control system. We innovatively design a lightweight forearm-worn architecture integrating MEMS capacitive microphones, IMUs, vibrotactile actuators, and pressure sensors to enable six-class gesture-force co-recognition. A CNN-LSTM temporal model combined with cross-modal (tactile/inertial/acoustic) synchronous fusion ensures real-time closed-loop motion control. Experimental results demonstrate offline and online gesture recognition accuracies of 88.33% and 83.33%, respectively; navigation-and-grasping success rate of 98% (with trajectory deviation of 3.6 cm); and end-to-end object transport success rate of 91.1%. The system significantly enhances the intuitiveness and reliability of human–robot interaction in assistive applications.

📝 Abstract
This paper proposes a wearable-controlled mobile manipulator system for intelligent smart home assistance, integrating MEMS capacitive microphones, IMU sensors, vibration motors, and pressure feedback to enhance human-robot interaction. The wearable device captures forearm muscle activity and converts it into real-time control signals for mobile manipulation. Using a CNN-LSTM model, the device achieves an offline classification accuracy of 88.33% across six distinct movement-force gesture classes, while real-world experiments with five participants yield a practical accuracy of 83.33% and an average system response time of 1.2 seconds. In human-robot synergy during navigation and grasping tasks, the robot achieved a 98% task success rate with an average trajectory deviation of only 3.6 cm. Finally, in object grasping and transfer tests covering nine object-texture combinations, the system achieved a 93.3% gripping success rate, a 95.6% transfer success rate, and a 91.1% full-task success rate. The results of these three experiments validate the effectiveness of MEMS-based wearable sensing combined with multi-sensor fusion for reliable and intuitive control of assistive robots in smart home scenarios.
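The CNN-LSTM classifier described above can be sketched as follows. This is a minimal illustrative model, not the authors' implementation: the channel count, window length, layer sizes, and kernel size are all assumptions, chosen only to show the pattern of 1-D convolutional feature extraction over fused sensor channels followed by an LSTM over time and a six-way classification head.

```python
import torch
import torch.nn as nn

class CNNLSTMClassifier(nn.Module):
    """Illustrative CNN-LSTM gesture classifier (hypothetical sizes):
    1-D convolutions extract local features from a window of fused
    sensor channels; an LSTM models the temporal sequence; a linear
    head scores the six movement-force classes."""

    def __init__(self, in_channels=10, num_classes=6,
                 conv_channels=32, lstm_hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, conv_channels, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),  # halve the temporal resolution
        )
        self.lstm = nn.LSTM(conv_channels, lstm_hidden, batch_first=True)
        self.head = nn.Linear(lstm_hidden, num_classes)

    def forward(self, x):
        # x: (batch, channels, time) window of synchronized sensor samples
        feats = self.conv(x)            # (batch, conv_channels, time/2)
        feats = feats.transpose(1, 2)   # (batch, time/2, conv_channels)
        _, (h_n, _) = self.lstm(feats)  # h_n: (1, batch, lstm_hidden)
        return self.head(h_n[-1])       # (batch, num_classes) logits

model = CNNLSTMClassifier()
window = torch.randn(4, 10, 100)  # 4 windows, 10 channels, 100 samples
logits = model(window)            # shape: (4, 6)
```

Taking only the last LSTM hidden state for classification is one common choice for fixed-length gesture windows; attention pooling over all time steps is an alternative.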
Problem

Research questions and friction points this paper is trying to address.

Develop wearable-controlled mobile manipulator for smart home assistance
Enhance human-robot interaction via multi-sensor fusion and real-time control
Validate system effectiveness in navigation, grasping, and object transfer tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Wearable device captures muscle activity for control
CNN-LSTM model achieves 88.33% gesture accuracy
Multi-sensor fusion ensures reliable robot assistance
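One practical prerequisite for the multi-sensor fusion named above is aligning sensor streams that arrive at different rates (IMU, pressure, acoustic) onto a common clock before stacking them into one input window. The sketch below shows this with simple linear resampling; the rates and the helper `fuse_streams` are assumptions for illustration, not the paper's method.

```python
import numpy as np

def fuse_streams(streams, rate_hz=100.0, duration_s=1.0):
    """Resample asynchronously sampled sensor streams onto a shared
    clock and stack them into one (channels, time) window.

    streams: list of (timestamps, values) pairs, one per sensor channel.
    """
    t_common = np.arange(0.0, duration_s, 1.0 / rate_hz)
    fused = [np.interp(t_common, t, v) for t, v in streams]
    return np.stack(fused)  # (n_channels, len(t_common))

# Hypothetical rates: IMU at 200 Hz, pressure at 50 Hz, audio envelope at 100 Hz
imu_t = np.linspace(0, 1, 200); imu_v = np.sin(2 * np.pi * imu_t)
prs_t = np.linspace(0, 1, 50);  prs_v = np.cos(2 * np.pi * prs_t)
aud_t = np.linspace(0, 1, 100); aud_v = np.abs(np.sin(4 * np.pi * aud_t))

window = fuse_streams([(imu_t, imu_v), (prs_t, prs_v), (aud_t, aud_v)])
# window.shape == (3, 100): three channels on a shared 100 Hz clock
```

The fused `(channels, time)` window is then in the right layout to feed a temporal classifier such as the paper's CNN-LSTM.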
Xiao Jin
CUHK
CV & RecSys
Bo Xiao
School of Robotics, Xi’an Jiaotong-Liverpool University (XJTLU), Suzhou, China, 215000
Huijiang Wang
EPFL
Soft robotics · dexterous manipulation · tactile sensing · aerospace engineering
Wendong Wang
China University of Petroleum (East China)
Flow in Porous Media · CCUS · Unconventional Resource Development
Zhenhua Yu
Department of Computer Science, University of Aberdeen, AB24 3UE, Aberdeen, UK