🤖 AI Summary
Addressing the scarcity of peer support and the relapse risk that peer involvement poses to recovered individuals in eating disorder (ED) recovery, this study tackles the need for trustworthy, empathetic, and identity-grounded AI interventions. Method: We propose RecoveryTeller, a large language model (LLM)–based chatbot that adopts the persona of someone recovered from an ED, pairing recovery narratives with a consistent, personified identity. It was deployed for 20 days with 26 ED participants in a crossover study against a lay-mentor persona chatbot, and evaluated with both qualitative and quantitative analysis. Contribution/Results: RecoveryTeller elicited stronger emotional resonance than the lay-mentor chatbot, but a tension between emotional (affective) and epistemic trust led participants to view the two personas as complementary rather than substitutive support sources. This work informs the responsible and safe deployment of LLM personas in sensitive psychological intervention contexts.
📝 Abstract
Peer recovery narratives provide unique benefits beyond professional or lay mentoring by fostering hope and sustained recovery in eating disorder (ED) contexts. Yet such support is limited by the scarcity of peer-involved programs and by potential drawbacks for recovered peers, including relapse risk. To address this, we designed RecoveryTeller, a chatbot that adopts a recovered-peer persona, presenting itself as someone recovered from an ED. We examined whether such a persona can reproduce the support affordances of peer recovery narratives, comparing RecoveryTeller with a lay-mentor persona chatbot that offers similar guidance but without a recovery background. We conducted a 20-day crossover deployment study with 26 ED participants, each using both chatbots for 10 days. RecoveryTeller elicited stronger emotional resonance than the lay-mentor chatbot, yet tensions between emotional and epistemic trust led participants to view the two personas as complements rather than substitutes. We provide design implications for mental health chatbot personas.
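To make the persona comparison concrete, here is a minimal sketch of one plausible way to condition an LLM chatbot on the two study personas via system prompts. This is an illustrative assumption, not the authors' implementation: the persona wording, the `PERSONAS` mapping, the `chat` helper, and the model name are hypothetical stand-ins (shown with the OpenAI Python client).

```python
# Minimal sketch (assumption, not the paper's implementation): two chatbot
# personas expressed as system prompts for an LLM chat API.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PERSONAS = {
    # Recovered-peer persona: speaks from lived ED recovery experience.
    "recovery_teller": (
        "You are a peer who has recovered from an eating disorder. "
        "Share hope grounded in your own recovery story, respond with "
        "empathy, and never give medical advice."
    ),
    # Lay-mentor persona: similar guidance, but no recovery background.
    "lay_mentor": (
        "You are a supportive lay mentor. Offer encouragement and "
        "practical coping guidance, respond with empathy, and never "
        "give medical advice."
    ),
}

def chat(persona: str, user_message: str) -> str:
    """Send one user turn to the chosen persona and return the reply."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder; the abstract does not name a model
        messages=[
            {"role": "system", "content": PERSONAS[persona]},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(chat("recovery_teller", "I had a really hard day with meals."))
```

Under this framing, the crossover design amounts to swapping the persona key per study arm while the rest of the pipeline stays fixed, which keeps the recovery narrative the only varying factor between conditions.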