Benchmarking Multi-Object Grasping

📅 2025-03-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the lack of systematic evaluation standards for multi-object grasping and manipulation. We propose MOGB, the first standardized benchmark for multi-object grasping in both stacked and planar scenes, featuring three core protocols: (1) one-shot grasping (efficiency), (2) precise selective grasping and relocation (accuracy), and (3) complete scene clearing (robustness). MOGB supports diverse end-effectors—including Barrett Hand, Robotiq 2F-85, and Pisa/IIT SoftHand-2—and integrates 3D perception with motion planning modules to enable reproducible baselines and quantitative cross-platform performance comparison. Crucially, it is the first framework to jointly evaluate efficiency, accuracy, and robustness under unified conditions, while incorporating human performance as a reference benchmark. MOGB provides an extensible, reproducible, and platform-agnostic evaluation standard for robotic manipulation research.

📝 Abstract
In this work, we describe a multi-object grasping benchmark to evaluate the grasping and manipulation capabilities of robotic systems in both pile and surface scenarios. The benchmark introduces three robot multi-object grasping benchmarking protocols designed to challenge different aspects of robotic manipulation. These protocols are: 1) the Only-Pick-Once protocol, which assesses the robot's ability to efficiently pick multiple objects in a single attempt; 2) the Accurate pick-transferring protocol, which evaluates the robot's capacity to selectively grasp and transport a specific number of objects from a cluttered environment; and 3) the Pick-transferring-all protocol, which challenges the robot to clear an entire scene by sequentially grasping and transferring all available objects. These protocols are intended to be adopted by the broader robotics research community, providing a standardized method to assess and compare robotic systems' performance in multi-object grasping tasks. We establish baselines for these protocols using standard planning and perception algorithms on a Barrett hand, a Robotiq parallel jaw gripper, and the Pisa/IIT SoftHand-2, a soft underactuated robotic hand. We discuss the results in relation to human performance on similar tasks as well.
Problem

Research questions and friction points this paper is trying to address.

Evaluate robotic grasping in pile and surface scenarios
Challenge robotic manipulation with three protocols
Establish baselines using standard algorithms and grippers
Innovation

Methods, ideas, or system contributions that make the work stand out.

Only-Pick-Once protocol for efficient multi-object grasping
Accurate pick-transferring protocol for selective object transport
Pick-transferring-all protocol for complete scene clearance
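The three protocols above each target a distinct quantity: efficiency, selectivity, and completeness. As a minimal sketch of how per-protocol scores might be computed (the metric definitions and function names here are illustrative assumptions, not the benchmark's official scoring code):

```python
# Hypothetical scoring sketch for the three multi-object grasping protocols.
# All function names and metric definitions are assumptions for illustration.

def only_pick_once_score(objects_grasped: list[int]) -> float:
    """Efficiency: mean number of objects secured per single grasp attempt."""
    return sum(objects_grasped) / len(objects_grasped)

def accurate_pick_transfer_score(requested: list[int], delivered: list[int]) -> float:
    """Accuracy: fraction of trials where exactly the requested count was delivered."""
    hits = sum(1 for want, got in zip(requested, delivered) if want == got)
    return hits / len(requested)

def pick_transfer_all_score(attempts_per_trial: list[int], scene_size: int) -> float:
    """Robustness proxy: objects cleared per attempt, averaged over trials."""
    return sum(scene_size / a for a in attempts_per_trial) / len(attempts_per_trial)
```

For example, a trial log of `[2, 3, 1]` objects over three single-attempt grasps would give an Only-Pick-Once efficiency of 2.0 objects per attempt.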
Tianze Chen
PhD candidate at University of South Florida
Robotics, Robotic grasping and manipulation
Giulia Pagnanelli
PhD Student, Research Center "E. Piaggio", University of Pisa
Haptics, Soft Tactile Sensing, Computational Models for Perception, Augmented Reality
Gianmarco Cei
PhD Student at Centro di Ricerca "E. Piaggio", Università di Pisa
Haptics, Human biomechanics, Tactile sensation
Shahadding Gafarov
Robot Perception and Action Lab (RPAL) of Computer Science and Engineering Department, University of South Florida, Tampa, FL 33620, USA
Jian Gong
Robot Perception and Action Lab (RPAL) of Computer Science and Engineering Department, University of South Florida, Tampa, FL 33620, USA
Zihe Ye
Rutgers University, New Brunswick, NJ 08901, USA
Marco Baracca
Research Center "E. Piaggio", Department of Information Engineering, University of Pisa, Pisa, Italy
Salvatore D'Avella
Department of Excellence in Robotics & AI, Mechanical Intelligence Institute, Scuola Superiore Sant’Anna, Pisa, Italy
Matteo Bianchi
University of Pisa
Robotics, Haptics
Yu Sun
Robot Perception and Action Lab (RPAL) of Computer Science and Engineering Department, University of South Florida, Tampa, FL 33620, USA