Conversational Decision Support for Information Search Under Uncertainty: Effects of Gist and Verbatim Feedback

📅 2026-02-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses suboptimal human decision-making under uncertainty, where individuals struggle to determine when to stop information search because of high cognitive load. Existing AI assistance primarily optimizes final outcomes and largely neglects low-burden support during the search process itself. To bridge this gap, the authors propose SERA, a conversational decision-support system powered by large language models that treats feedback granularity—gist versus verbatim—as a core design variable. SERA generates real-time, cognitively grounded information-gain feedback across three uncertainty environments. Across two experiments, SERA improved both decision accuracy and confidence, with benefits most pronounced under high uncertainty: gist feedback was associated with reduced over-sampling, while verbatim feedback encouraged deeper exploration. The work establishes feedback representation as a design lever for adaptive, process-oriented decision support.
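The two feedback granularities can be pictured with a minimal sketch. This toy renderer is an illustrative assumption, not SERA's actual prompt or template: "verbatim" replays the full evidence record, while "gist" compresses it to a bottom-line tendency.

```python
from collections import Counter

def render_feedback(samples, mode):
    """Render search feedback in one of two granularities (illustrative only).

    samples: list of evidence labels drawn so far, e.g. ["A", "A", "B"].
    mode: "verbatim" replays the full record; "gist" gives an aggregate summary.
    """
    if mode == "verbatim":
        # Verbatim: every sampled piece of evidence, in order.
        return "Samples so far: " + ", ".join(samples)
    # Gist: compress the record into its dominant tendency.
    counts = Counter(samples)
    leader, n = counts.most_common(1)[0]
    return f"{len(samples)} samples; most point to {leader} ({n}/{len(samples)})."
```

For three draws `["A", "A", "B"]`, the verbatim mode returns the full list while the gist mode reports only that most samples point to A, mirroring the verbatim/gist distinction from fuzzy-trace accounts of memory representation.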

📝 Abstract
Many real-world decisions rely on information search, in which people sample evidence and decide when to stop under uncertainty. How diagnostic evidence is distributed in the environment complicates this search and can lead to suboptimal decision outcomes. Yet AI decision support typically targets outcome optimization, and little is known about how to scaffold search without increasing cognitive load. We introduce SERA, an LLM-based assistant that provides either gist or verbatim feedback during search. Across two experiments (N1 = 54, N2 = 54), we examined decision outcomes and information search under SERA-Gist, SERA-Verbatim, and a no-feedback baseline in three environments varying in uncertainty. Uncertainty is operationalized by how perceived information gain evolves over the course of sampling: as people search more, gain may diminish steadily (decremental; low uncertainty), drop temporarily (local optimum; medium uncertainty), or show no pattern (high uncertainty). With SERA support, individuals made more accurate decisions and were more confident, especially under higher uncertainty. Gist feedback was associated with more efficient integration and a descriptive pattern of reduced oversampling, while verbatim feedback promoted more extensive exploration. These findings establish feedback representation as a design lever when search matters, motivating adaptive systems that match feedback granularity to uncertainty.
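The three uncertainty environments can be sketched as per-sample information-gain schedules. The exact functional forms below are illustrative assumptions (the paper's stimuli are not reproduced here); they only capture the qualitative shapes named in the abstract.

```python
import random

def information_gain(t, environment):
    """Hypothetical information gain of the t-th sample (t >= 1)
    in each of the three uncertainty environments. Illustrative only."""
    if environment == "decremental":
        # Low uncertainty: diminishing returns as sampling continues.
        return 1.0 / t
    if environment == "local_optimum":
        # Medium uncertainty: a temporary dip (here at t = 4) that could
        # be mistaken for exhaustion of the evidence, then recovery.
        return 0.1 if t == 4 else 0.6
    if environment == "no_pattern":
        # High uncertainty: gain fluctuates with no usable trend.
        return random.uniform(0.0, 1.0)
    raise ValueError(f"unknown environment: {environment}")
```

Under a schedule like this, a rational stopping rule in the decremental environment is simple (stop once gain falls below the cost of sampling), whereas the local dip and the patternless profile make stopping decisions error-prone, which is where process-level feedback is claimed to help most.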
Problem

Research questions and friction points this paper is trying to address.

information search
decision-making under uncertainty
cognitive load
feedback representation
evidence sampling
Innovation

Methods, ideas, or system contributions that make the work stand out.

conversational decision support
gist vs. verbatim feedback
information search under uncertainty
LLM-based assistant
adaptive feedback granularity