"If we misunderstand the client, we misspend 100 hours": Exploring conversational AI and response types for information elicitation

📅 2025-06-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Misinterpreting client requirements in design practice wastes significant resources. Method: This study proposes a digital requirements elicitation method that integrates conversational AI with choice-based responses to improve requirement alignment during early client–designer collaboration. A three-phase empirical investigation was conducted, combining semi-structured interviews, a 2×2 factorial experiment, and the User Experience Questionnaire (UEQ); notably, the work integrates structured responses into conversational AI interactions to establish a bidirectional, collaborative requirements elicitation paradigm. Contribution/Results: The method improves the clarity of client inputs and the accuracy of requirement articulation, and strengthens both parties' readiness for early-stage collaboration. Although it lowers perceived dependability (a UEQ scale), it overall enhances the quality and efficiency of requirements elicitation. The study delivers a reusable methodology and tool framework for human–AI collaborative requirements engineering in professional design domains.

📝 Abstract
Client-designer alignment is crucial to the success of design projects, yet little research has explored how digital technologies might influence this alignment. To address this gap, this paper presents a three-phase study investigating how digital systems can support requirements elicitation in professional design practice. Specifically, it examines how integrating a conversational agent and choice-based response formats into a digital elicitation tool affects early-stage client-designer collaboration. The first phase of the study inquired into the current practices of 10 design companies through semi-structured interviews, informing the system's design. The second phase evaluated the system using a 2×2 factorial design with 50 mock clients, quantifying the effects of conversational AI and response type on user experience and perceived preparedness. In phase three, the system was presented to seven of the original 10 companies to gather reflections on its value, limitations, and potential integration into practice. Findings show that both conversational AI and choice-based responses lead to lower dependability scores on the User Experience Questionnaire, yet result in client input with greater clarity. We contribute design implications for integrating conversational AI and choice-based responses into elicitation tools to support mutual understanding in early-stage client-designer collaboration.
Problem

Research questions and friction points this paper is trying to address.

Exploring how digital systems improve client-designer alignment
Investigating conversational AI's impact on requirements elicitation
Evaluating choice-based responses for clearer client input
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates conversational AI for requirements elicitation
Uses choice-based response formats for clarity
Evaluates system with UX Questionnaire metrics
Daniel Hove Paludan
Aalborg University, Denmark
Julie Fredsgaard
Aalborg University, Denmark
Kasper Patrick Bahrentz
Aalborg University, Denmark
Ilhan Aslan
Associate Professor, Aalborg University
Intelligent User Interfaces · Human-Computer Interaction · Human-Centered AI