🤖 AI Summary
This study investigates how robots influence human cooperation and trust within mixed human-robot groups, using the Public Goods Game (PGG) as a framework. The authors design a modified PGG in which three human participants play alongside the humanoid robot iCub, systematically varying the robot's strategy (always cooperate, always free ride, or tit-for-tat) to test whether it influences participants' willingness to contribute to the common pool. The setup is validated in a pilot study with nineteen participants. A preliminary analysis indicates that participants tend not to invest their money in the common pool, even though they perceive the robot as generous. The work contributes an experimental, game-theoretic paradigm for studying trust and cohesion in human-robot group interactions, with potential implications for designing social robots that foster cooperation in mixed human-robot teams.
📝 Abstract
In this study, we explore the potential of game theory as a means to investigate cooperation and trust in mixed human-robot groups. In particular, we adopt the Public Goods Game (PGG), a model that highlights the tension between individual self-interest and collective well-being. We present a modified version of the PGG in which three human participants play with the humanoid robot iCub, to assess whether different robot strategies (i.e., always cooperate, always free ride, and tit-for-tat) influence the participants' inclination to cooperate. We test our setup in a pilot study with nineteen participants. A preliminary analysis indicates that participants prefer not to invest their money in the common pool, even though they perceive the robot as generous. Through this research, we seek to gain insight into the role robots can play in promoting trust and cohesion during human-robot interactions in group contexts. The results of this study may inform the development of social robots capable of fostering trust and cooperation within mixed human-robot groups.
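To make the tension the PGG captures concrete, the sketch below implements the standard PGG payoff rule (each player keeps their unspent endowment plus an equal share of the multiplied pool) together with the three robot strategies named above. The function names, the endowment of 10, and the multiplier of 1.6 are illustrative assumptions, not values or code from the paper.

```python
def pgg_payoffs(contributions, endowment=10, multiplier=1.6):
    """One PGG round: each player keeps what they did not contribute,
    plus an equal share of the multiplied common pool."""
    pool = sum(contributions) * multiplier
    share = pool / len(contributions)
    return [endowment - c + share for c in contributions]


def robot_contribution(strategy, endowment, last_human_contributions=None):
    """Illustrative robot policies (a sketch, not the paper's implementation)."""
    if strategy == "always_cooperate":
        return endowment          # invest the full endowment every round
    if strategy == "free_ride":
        return 0                  # never invest, still collect the share
    if strategy == "tit_for_tat":
        # Mirror the humans' average contribution from the previous round;
        # cooperate fully on the first round (a common tit-for-tat opening).
        if not last_human_contributions:
            return endowment
        return round(sum(last_human_contributions) / len(last_human_contributions))
    raise ValueError(f"unknown strategy: {strategy}")


# Three humans plus the robot, mirroring the study's group composition.
humans = [4, 6, 2]
robot = robot_contribution("tit_for_tat", 10, last_human_contributions=humans)
payoffs = pgg_payoffs(humans + [robot], endowment=10, multiplier=1.6)
```

With a multiplier between 1 and the group size, contributing is collectively optimal but individually dominated: each token invested returns only multiplier/n (here 0.4) to the contributor, which is the free-rider temptation the study exploits.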