🤖 AI Summary
This study addresses the limitations of current conversational AI systems, which predominantly focus on clinical diagnosis and treatment yet fall short in meeting everyday emotional support needs outside clinical settings. Drawing on spiritual care perspectives from chaplaincy, the work proposes a novel design paradigm for non-clinical dialogue AI centered on "attunement." Through participatory design involving eighteen chaplains, the research combines qualitative interviews with chatbot prototyping to systematically explore human–AI affective interaction around four core themes: Listening, Connecting, Carrying, and Wanting. The findings illuminate the current constraints of AI in providing everyday emotional support and offer both a theoretical framework and practical pathways for developing well-being-oriented, non-clinical conversational agents.
📝 Abstract
Despite growing recognition that responsible AI requires domain knowledge, current work on conversational AI primarily draws on clinical expertise that prioritises diagnosis and intervention. However, many everyday emotional support needs arise in non-clinical contexts and therefore require different conversational approaches. We examine how chaplains, who guide individuals through personal crises, grief, and reflection, perceive and engage with conversational AI. We recruited eighteen chaplains to build AI chatbots. While some chaplains viewed chatbots with cautious optimism, the majority pointed to limitations in chatbots' ability to support everyday well-being. Our analysis reveals how chaplains perceive their pastoral care duties and the areas where AI chatbots fall short, along the themes of Listening, Connecting, Carrying, and Wanting. These themes resonate with the idea of attunement, recently highlighted as a relational lens for understanding the delicate experiences care technologies provide. This perspective informs chatbot design aimed at supporting well-being in non-clinical contexts.