🤖 AI Summary
This paper addresses the missing affective dimension in voice-based interaction, which undermines the sociality of intelligent agents. To bridge this gap, we propose “speech emotion commands”: a composite modality that jointly models the semantic content and the prosodic affect (e.g., intonation, rhythm) of spoken instructions. We implement a real-time affect-aware agent prototype and evaluate it in a retro-style dual-vehicle control game (N = 14). Combining voice emotion analysis, real-time behavioral adaptation, and Likert-scale assessments, the study identifies users’ deliberate modulation of vocal expression to convey intent or emotion, described as “voice acting”. Results show that the affect-adaptive agent significantly enhances perceived stimulation (p < 0.01) and dependability (p < 0.05). These findings substantiate the critical role of affective responsiveness in improving interactional credibility and social presence.
📝 Abstract
In an era of human-computer interaction with increasingly agentic AI systems capable of connecting with users conversationally, speech is an important modality for commanding agents. By recognizing and using speech emotions (i.e., how a command is spoken), we can provide agents with the ability to emotionally accentuate their responses and socially enrich users' perceptions and experiences. To explore the concept and impact of speech emotion commands on user perceptions, we realized a prototype and conducted a user study (N = 14) in which speech commands are used to steer two vehicles in a minimalist, retro-style game implementation. While both agents execute user commands, only one of the agents uses speech emotion information to adapt its execution behavior. We report on differences in how users perceived each agent, including significant differences in stimulation and dependability, outline implications for designing interactions with agents using emotional speech commands, and provide insights into how users consciously emote, which we describe as "voice acting".
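To make the composite modality concrete, the sketch below shows one way an agent could combine the semantic content of a command (what was said) with a prosodic affect signal (how it was said) when executing it. This is not the authors' implementation: the librosa-based arousal proxy, the scaling constants, and the behavior mapping are all illustrative assumptions.

```python
# Minimal sketch of a speech-emotion-adaptive command pipeline.
# The arousal estimate, thresholds, and behavior mapping below are
# illustrative assumptions, not values or methods from the paper.
import numpy as np
import librosa


def estimate_arousal(wav_path: str) -> float:
    """Crude arousal proxy in [0, 1] from pitch variability and loudness."""
    y, sr = librosa.load(wav_path, sr=16000)
    f0, voiced, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    # Higher pitch variability and louder speech are treated as more aroused.
    pitch_var = float(np.nanstd(f0)) if np.any(voiced) else 0.0
    energy = float(librosa.feature.rms(y=y).mean())
    # Scaling constants (50 Hz, 0.1 RMS) are arbitrary, for illustration only.
    return float(
        0.5 * np.clip(pitch_var / 50.0, 0, 1) + 0.5 * np.clip(energy / 0.1, 0, 1)
    )


def adapt_vehicle_command(command: str, arousal: float) -> dict:
    """Adapt how a steering command is executed based on vocal arousal."""
    return {
        "action": command,                      # semantic content: what was said
        "speed": 0.5 + arousal,                 # prosodic affect: how it was said
        "urgency": "high" if arousal > 0.6 else "normal",
    }


# Example: an excited "left!" makes the affect-aware vehicle react faster,
# while the non-adaptive baseline would ignore the arousal signal entirely.
arousal = estimate_arousal("command.wav")
print(adapt_vehicle_command("left", arousal))
```

A baseline agent in this setup would call only the semantic half of the pipeline, which mirrors the paper's comparison between the adaptive and non-adaptive vehicles.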