🤖 AI Summary
This study addresses the limitations of current generative AI systems in mental health crisis scenarios, which often adopt risk-averse strategies that withhold substantive support, potentially discouraging help-seeking and undermining public mental health. Moving beyond safety paradigms dominated by liability avoidance, this work proposes a novel framework centered on user empowerment, integrating the community helper model into generative AI design for the first time. By synthesizing generative dialogue systems, community helping theory, and human–computer interaction principles—alongside aligned policy and regulatory mechanisms—the approach enables AI to serve as a supportive bridge during early crisis stages, helping users de-escalate distress and connect with professional resources. This research offers an innovative pathway to enhance both the safety and efficacy of generative AI in mental health interventions, with the potential to increase help-seeking behaviors and improve population-level mental well-being.
📝 Abstract
People experiencing mental health crises frequently turn to open-ended generative AI (GenAI) chatbots such as ChatGPT for support. However, rather than providing immediate assistance, most GenAI chatbots are designed to respond to crisis situations in ways that minimize their developers' liability, primarily through avoidance (e.g., refusing to engage beyond templated referrals to crisis hotlines). Withholding crisis support in these cases may harm users who have no viable alternatives and reduce their motivation to seek further help. At scale, this avoidant design could undermine population mental health. We propose empowerment-oriented design principles for AI crisis support, informed by community helper models. As an initial touchpoint in help-seeking, AI chatbots can act as a supportive bridge, de-escalating crises and connecting users to more reliable care. Coordination between AI developers and regulators can enable a better balance of risk mitigation and user empowerment in AI crisis support.