🤖 AI Summary
To mitigate safety risks—such as toxic chemical exposure—arising from operational errors (e.g., improper vial capping or rack insertion) in laboratory robotics, this paper proposes an autonomous execution framework integrating multimodal perception (vision and force sensing) with a behavior tree architecture. At the task planning level, the framework employs behavior trees to ensure interpretable and formally verifiable decision logic; at the execution level, it synchronously fuses multi-source sensor feedback for real-time state monitoring and error detection. Its key innovation lies in deeply embedding perception-based closed-loop validation directly into behavior tree nodes, enabling online execution verification and adaptive recovery. Evaluated in realistic chemistry lab settings, the framework achieves success rates of 88% for vial capping and 92% for rack insertion, while significantly improving error detection accuracy. This enhances both the safety and reliability of automated laboratory experimentation.
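The summary does not spell out the node design, but the core idea—embedding perception-based validation directly into behavior tree nodes—can be sketched as follows. This is a minimal, self-contained illustration, not the paper's implementation: the node names, sensor fields, and torque thresholds are all hypothetical stand-ins for the vision and force feedback the framework fuses.

```python
from enum import Enum


class Status(Enum):
    SUCCESS = "success"
    FAILURE = "failure"


class VerifiedAction:
    """Behavior-tree leaf that executes an action, then validates it against
    multimodal sensor checks (closed-loop validation inside the node)."""

    def __init__(self, name, action, checks, max_retries=1):
        self.name = name
        self.action = action        # callable that acts on the shared state
        self.checks = checks        # perception-based validation predicates
        self.max_retries = max_retries

    def tick(self, state):
        # Retry loop gives a simple form of adaptive recovery on failure.
        for _ in range(self.max_retries + 1):
            self.action(state)
            if all(check(state) for check in self.checks):
                return Status.SUCCESS
        return Status.FAILURE


class Sequence:
    """Standard behavior-tree sequence: succeeds only if every child does."""

    def __init__(self, children):
        self.children = children

    def tick(self, state):
        for child in self.children:
            if child.tick(state) is not Status.SUCCESS:
                return Status.FAILURE
        return Status.SUCCESS


# Hypothetical stand-ins for the robot action and its sensor feedback.
def cap_vial(state):
    state["cap_torque"] = 0.45   # N*m reported by a wrist force sensor
    state["cap_aligned"] = True  # result of a vision-based pose check


vision_ok = lambda s: s.get("cap_aligned", False)
force_ok = lambda s: 0.3 <= s.get("cap_torque", 0.0) <= 0.6

capping = Sequence([VerifiedAction("cap_vial", cap_vial, [vision_ok, force_ok])])
print(capping.tick({}))  # Status.SUCCESS: both perception checks pass
```

The point of the sketch is that success is never assumed from the action alone: the node only reports `SUCCESS` once the fused vision and force checks agree, which is what makes the tree's status meaningful in a safety-critical setting.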
📝 Abstract
Laboratory robotics offers the capability to conduct experiments with a high degree of precision and reproducibility, with the potential to transform scientific research. Trivial and repeatable tasks, e.g., sample transportation for analysis and vial capping, are well-suited for robots; if these tasks are performed successfully and reliably, chemists can devote their efforts to more critical research activities. Currently, robots can perform these tasks faster than chemists, but how reliable are they? Improper capping could result in human exposure to toxic chemicals, which could be fatal. To ensure that robots perform these tasks as accurately as humans, sensory feedback is required to assess the progress of task execution. To address this, we propose a novel methodology based on behaviour trees with multimodal perception. Along with automating robotic tasks, this methodology also verifies their successful execution, a fundamental requirement in safety-critical environments. The experimental evaluation was conducted on two laboratory tasks: sample vial capping and laboratory rack insertion. The results show high success rates, i.e., 88% for capping and 92% for insertion, along with strong error detection capabilities. This demonstrates the robustness and reliability of our approach and suggests that multimodal behaviour trees should pave the way towards the next generation of robotic chemists.