🤖 AI Summary
This study investigates how speech modality (text, neutral speech, expressive speech) and dialogue type (open-ended, impersonal, personal) influence psychosocial outcomes of AI chatbot use. A four-week randomized controlled trial with 981 participants generated over 300,000 real-world interaction logs, analyzed with validated psychometric instruments (e.g., the UCLA Loneliness Scale), longitudinal modeling, and moderated mediation analysis. Results show that: (1) higher usage frequency is consistently associated with greater loneliness, emotional dependence on AI, and problematic use, and with reduced real-world social engagement; (2) a neutral voice intensifies dependence at high usage levels; (3) personal dialogue topics reduce dependence but slightly increase loneliness; and (4) individuals with higher attachment anxiety report greater loneliness, whereas those with higher trust in the AI chatbot exhibit stronger emotional dependence on it. This work provides the first empirical evidence of the dynamic interplay between vocal expressivity and user behavior, offering foundational insights for ethically grounded design and targeted interventions in AI social agents.
📝 Abstract
AI chatbots, especially those with voice capabilities, have become increasingly human-like, with more users seeking emotional support and companionship from them. Concerns are rising about how such interactions might impact users' loneliness and socialization with real people. We conducted a four-week randomized, controlled, IRB-approved experiment (n=981, >300K messages) to investigate how AI chatbot interaction modes (text, neutral voice, and engaging voice) and conversation types (open-ended, non-personal, and personal) influence psychosocial outcomes such as loneliness, social interaction with real people, emotional dependence on AI, and problematic AI usage. Results showed that while voice-based chatbots initially appeared beneficial in mitigating loneliness and dependence compared with text-based chatbots, these advantages diminished at high usage levels, especially with a neutral-voice chatbot. Conversation type also shaped outcomes: personal topics slightly increased loneliness but tended to lower emotional dependence compared with open-ended conversations, whereas non-personal topics were associated with greater dependence among heavy users. Overall, higher daily usage, across all modalities and conversation types, correlated with higher loneliness, dependence, and problematic use, and with lower socialization. Exploratory analyses revealed that those with stronger emotional attachment tendencies and higher trust in the AI chatbot tended to experience greater loneliness and emotional dependence, respectively. These findings underscore the complex interplay between chatbot design choices (e.g., voice expressiveness) and user behaviors (e.g., conversation content, usage frequency). We highlight the need for further research on whether chatbots' ability to manage emotional content without fostering dependence or replacing human relationships benefits overall well-being.