🤖 AI Summary
This work addresses the challenge of estimating, online, the unknown load mass, inertial parameters, and individual robot grasping poses during cooperative transport by multiple mobile collaborative robots (mocobots). To this end, we propose a fully coupled parameter estimation algorithm that leverages coordinated motion excitation and multi-point wrench/twist measurements. The method integrates rigid-body dynamics modeling, least-squares optimization, and distributed sensor fusion, relying solely on onboard proprioceptive sensing (joint/tip force-torque and motion-state measurements), without requiring prior knowledge of grasping geometry or external perception systems. Evaluated on a three-robot experimental platform, the algorithm achieves real-time, simultaneous estimation of the load's center-of-mass position, inertia matrix, and grasping transformation matrices, with average estimation errors below 4.2%. To the best of our knowledge, this is the first approach to realize fully coupled, online identification of multi-point dynamic parameters under completely unknown grasping configurations.
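To make the least-squares idea concrete, here is a deliberately simplified sketch of load mass and center-of-mass identification from wrench measurements at a grasp frame. The paper's actual algorithm is fully coupled and uses dynamic excitation; the version below assumes quasi-static poses (gravity only), and the function name `estimate_mass_and_com` is hypothetical, not from the paper.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def estimate_mass_and_com(forces, torques, g=9.81):
    """Least-squares mass and center-of-mass estimate from quasi-static
    force/torque readings taken at several payload orientations.

    forces, torques: (N, 3) arrays expressed in the sensor (grasp) frame.
    Model (gravity load only):
        f_i   = m * R_i^T @ g_vec      ->  ||f_i|| = m * g
        tau_i = c x f_i = -[f_i]_x c   ->  linear in the CoM offset c
    """
    forces = np.asarray(forces, float)
    torques = np.asarray(torques, float)
    m = np.mean(np.linalg.norm(forces, axis=1)) / g
    A = np.vstack([-skew(f) for f in forces])   # stack of (3, 3) blocks
    b = torques.reshape(-1)
    c, *_ = np.linalg.lstsq(A, b, rcond=None)
    return m, c
```

At least two non-parallel gravity directions in the sensor frame are needed for the CoM to be observable; the real method excites richer motions to also recover the inertia matrix.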
📝 Abstract
Consider the following scenario: a human guides multiple mobile manipulators to grasp a common payload. For subsequent high-performance autonomous manipulation of the payload by the mobile manipulator team, or for collaborative manipulation with the human, the robots should be able to discover where the other robots are attached to the payload, as well as the payload's mass and inertial properties. In this paper, we describe a method for the robots to autonomously discover this information. The robots cooperatively manipulate the payload, and the twist, twist derivative, and wrench data at their grasp frames are used to estimate the transformation matrices between the grasp frames, the location of the payload's center of mass, and the payload's inertia matrix. The method is validated experimentally with a team of three mobile cobots, or mocobots.
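The abstract states that twist data at the grasp frames is used to estimate the transformations between those frames. One standard way this can work, shown here as an illustrative sketch rather than the paper's algorithm, exploits the fact that two frames rigidly attached to the same payload have twists related by the adjoint map: the angular parts give the rotation via an orthogonal Procrustes (Kabsch) fit, and the linear parts then give the translation by linear least squares. The function name `relative_transform_from_twists` is hypothetical.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def relative_transform_from_twists(twists_a, twists_b):
    """Estimate (R, p) of frame {b} in frame {a} for one shared rigid body.

    twists_*: (N, 6) arrays, rows ordered (wx, wy, wz, vx, vy, vz), each
    expressed in its own frame. With twist_a = Ad_{T_ab} twist_b:
        omega_a = R @ omega_b                 -> Kabsch / Procrustes for R
        v_a = R @ v_b + [p]_x (R @ omega_b)   -> least squares for p
    """
    Wa, Va = twists_a[:, :3], twists_a[:, 3:]
    Wb, Vb = twists_b[:, :3], twists_b[:, 3:]
    H = Wb.T @ Wa                              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    w = Wb @ R.T                               # R @ omega_b for each sample
    A = np.vstack([-skew(wi) for wi in w])     # [p]_x w == -[w]_x p
    b = (Va - Vb @ R.T).reshape(-1)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return R, p
```

Observability requires angular velocities that span three dimensions, which is why the robots must cooperatively excite the payload's motion rather than hold it still.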