Should AI Mimic People? Understanding AI-Supported Writing Technology Among Black Users

📅 2025-05-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study examines how AI-supported writing technologies (AISWT) serve Black American users and finds systematic failures of cultural competence: the tools misrecognize commonly used African American names and expressions in African American Vernacular English (AAVE), and participants experienced the resulting corrections as hurtful and alienating. Drawing on interviews and observational user studies with 13 Black American AISWT users, the study surfaces a strong tradeoff between the perceived benefits of AISWT for enhancing one's writing style and the feeling that "it wasn't built for us," along with fears that the technology might further minoritize Black culture. The paper closes by reflecting on the tension between AISWT that fail to include Black American culture and language and AISWT that attempt to mimic it, with attention to accuracy, authenticity, and the production of social difference—moving the discussion of linguistic inclusivity from abstract fairness debates toward concrete evaluation and design practice for equity-centered NLP systems.

📝 Abstract
AI-supported writing technologies (AISWT) that provide grammatical suggestions, autocomplete sentences, or generate and rewrite text are now a regular feature integrated into many people's workflows. However, little is known about how people perceive the suggestions these tools provide. In this paper, we investigate how Black American users perceive AISWT, motivated by prior findings in natural language processing that highlight how the underlying large language models can contain racial biases. Using interviews and observational user studies with 13 Black American users of AISWT, we found a strong tradeoff between the perceived benefits of using AISWT to enhance their writing style and feeling like "it wasn't built for us". Specifically, participants reported AISWT's failure to recognize commonly used names and expressions in African American Vernacular English, experiencing its corrections as hurtful and alienating and fearing it might further minoritize their culture. We end with a reflection on the tension between AISWT that fail to include Black American culture and language, and AISWT that attempt to mimic it, with attention to accuracy, authenticity, and the production of social difference.
Problem

Research questions and friction points this paper is trying to address.

Investigates Black users' perceptions of AI writing tools' biases
Explores tradeoffs between writing enhancement and cultural exclusion
Examines racial bias in AI's language corrections and suggestions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines interviews with observational user studies of 13 Black American AISWT users
Documents the tradeoff between writing enhancement and feeling "it wasn't built for us"
Reflects on the tension between AISWT that exclude Black American language and AISWT that mimic it