🤖 AI Summary
Existing kinesthetic haptic interfaces suffer from limited workspace, low degrees of freedom (DOFs), and kinematic mismatch with the human arm, hindering immersive VR experiences. This paper proposes a large-scale, high-DOF biomimetic kinesthetic haptic interface whose mechanical architecture closely aligns with human upper-limb kinematics, enabling natural walking and omnidirectional force feedback. The core contribution is a novel admittance-based hierarchical force control framework that natively integrates joint-space and Cartesian-space constraint handling with singularity avoidance, enabling unified rendering of arbitrary serial kinematic chains and Cartesian admittance behaviors. Experiments demonstrate high-fidelity force rendering and stable interaction during complex dynamic tasks, significantly improving interaction naturalness and robustness. The approach establishes a scalable hardware–control co-design pathway for immersive VR systems.
📝 Abstract
Research in virtual reality and haptic technologies has consistently aimed to enhance immersion. While advanced head-mounted displays are now commercially available, kinesthetic haptic interfaces still face challenges such as limited workspaces, insufficient degrees of freedom, and kinematics that do not match the human arm. In this paper, we present HapticGiant, a novel large-scale kinesthetic haptic interface designed to match the properties of the human arm as closely as possible and to facilitate natural user locomotion while providing full haptic feedback. The interface incorporates a novel admittance-type force control scheme, leveraging hierarchical optimization to render both arbitrary serial kinematic chains and Cartesian admittances. Notably, the proposed control scheme natively accounts for system limitations, including joint and Cartesian constraints, as well as singularities. Experimental results demonstrate the effectiveness of HapticGiant and its control scheme, paving the way for highly immersive virtual reality applications.
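To give a rough feel for the admittance-type rendering idea mentioned above, the sketch below integrates a one-DOF virtual mass-damper-spring driven by a measured external force to produce a motion command. This is only a minimal illustration of the general admittance principle, not the paper's hierarchical controller; all parameter values (`m`, `d`, `k`, `dt`) are illustrative assumptions.

```python
# One-DOF admittance-law sketch: the device measures an external force
# f_ext and integrates the virtual dynamics m*a + d*v + k*x = f_ext
# (semi-implicit Euler) to obtain the motion the actuators should track.
# Parameter values are illustrative, not taken from the paper.

def admittance_step(x, v, f_ext, m=2.0, d=15.0, k=50.0, dt=0.001):
    """One integration step; returns the updated (position, velocity)."""
    a = (f_ext - d * v - k * x) / m
    v_new = v + a * dt
    x_new = x + v_new * dt
    return x_new, v_new

# Apply a constant 10 N push for one second: the virtual handle moves
# toward the static deflection f/k = 0.2 m.
x, v = 0.0, 0.0
for _ in range(1000):
    x, v = admittance_step(x, v, 10.0)
```

In a full interface, the same idea runs per Cartesian axis (or in joint space), with the commanded motion additionally filtered through constraint and singularity handling before being sent to the drives.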