Personality over Precision: Exploring the Influence of Human-Likeness on ChatGPT Use for Search

📅 2025-11-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates how “human-likeness” in conversational search influences user adoption and overtrust, specifically examining whether and why users trade factual accuracy for anthropomorphic interaction. Method: We conducted a survey with 173 dual-platform users (ChatGPT and Google), integrating multidimensional perceptual measures—including trust, anthropomorphism perception, and interface design preferences—to identify behavioral patterns. Contribution/Results: We identified two distinct user segments: daily dual-platform users exhibit higher trust in ChatGPT and systematically prioritize natural, fluent dialogue over factual accuracy; Google-dominant users report lower trust but value its ad-free environment and response reliability. Critically, this is the first empirical demonstration that anthropomorphic design—while enhancing adoption—systematically induces overtrust, thereby exposing users to accuracy-related risks. These findings provide foundational behavioral evidence and theoretical insights for ethically grounded, trustworthy AI search system design.

📝 Abstract
Conversational search interfaces, like ChatGPT, offer an interactive, personalized, and engaging user experience compared to traditional search. On the downside, they are prone to overtrust issues, where users rely on their responses even when they are incorrect. Which aspects of the conversational interaction paradigm drive people to adopt it, and how it creates personalized experiences that lead to overtrust, remain unclear. To understand the factors influencing the adoption of conversational interfaces, we conducted a survey with 173 participants. We examined user perceptions regarding trust, human-likeness (anthropomorphism), and design preferences between ChatGPT and Google. To better understand the overtrust phenomenon, we asked users about their willingness to trade off factuality for constructs like ease of use or human-likeness. Our analysis identified two distinct user groups: those who use both ChatGPT and Google daily (DUB), and those who primarily rely on Google (DUG). The DUB group exhibited higher trust in ChatGPT, perceiving it as more human-like, and expressed greater willingness to trade factual accuracy for enhanced personalization and conversational flow. Conversely, the DUG group showed lower trust toward ChatGPT but still appreciated aspects like ad-free experiences and responsive interactions. Demographic analysis further revealed nuanced patterns, with middle-aged adults using ChatGPT less frequently yet trusting it more, suggesting potential vulnerability to misinformation. Our findings contribute to understanding user segmentation, emphasize the critical roles of personalization and human-likeness in conversational IR systems, and reveal important implications regarding users' willingness to compromise factual accuracy for more engaging interactions.
Problem

Research questions and friction points this paper is trying to address.

Examining how human-likeness in ChatGPT affects user trust and adoption
Investigating users' willingness to sacrifice factual accuracy for engaging interactions
Identifying user segments with different vulnerability to misinformation in conversational search
Innovation

Methods, ideas, or system contributions that make the work stand out.

Surveyed user trust and anthropomorphism in conversational interfaces
Identified user groups based on ChatGPT and Google usage patterns
Analyzed willingness to trade factual accuracy for personalization
Mert Yazan
Amsterdam University of Applied Sciences, Amsterdam, Netherlands
Frederik Bungaran Ishak Situmeang
Amsterdam University of Applied Sciences, Amsterdam, Netherlands
Suzan Verberne
Leiden Institute of Advanced Computer Science, Leiden University
Information Retrieval · Natural Language Processing