Mixer Metaphors: audio interfaces for non-musical applications

📅 2025-04-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates whether audio interface metaphors—such as analog synthesizers and mixing consoles—can effectively transfer to embodied control of large language models (LLMs). To this end, we designed and implemented a physical interactive hardware prototype enabling artists to manipulate semantic content, stylistic attributes, and structural properties of LLM outputs in real time via knobs, faders, and other tactile controls. This represents the first systematic application of audio-domain embodied interaction paradigms to non-musical AI systems. Integrating low-latency LLM API calls and conducting a one-week comparative user study, we found that the audio-metaphor interface significantly enhances operational directness, frequency of improvisational exploration, and creative immersion relative to conventional software-only interfaces; users reported stronger perceived parameter agency and capacity for creative intervention. Our core contribution is empirical validation that cross-modal metaphor supports more embodied, intelligible, and creatively generative interaction with complex AI systems.
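To make the mixer metaphor concrete, here is a minimal sketch of how tactile controls might be wired to LLM request parameters. This is not the authors' implementation; the fader names, parameter ranges, and the `map_controls` function are illustrative assumptions about how normalized knob/fader positions could drive sampling settings and a style prompt.

```python
# Hypothetical sketch (not the paper's actual code): map normalized
# fader positions (0.0-1.0) read from hardware to LLM request settings.

def map_controls(faders: dict[str, float]) -> dict:
    """Translate named fader positions into illustrative LLM settings.

    The fader names ("creativity", "length", "formality") and the
    numeric ranges are assumptions for demonstration only.
    """
    # A "creativity" fader sweeping the sampling temperature, 0.2-1.5.
    temperature = 0.2 + faders["creativity"] * 1.3
    # A "length" fader controlling the output budget, 50-500 tokens.
    max_tokens = int(50 + faders["length"] * 450)
    # A "formality" fader toggling a stylistic instruction at midpoint.
    formality = "formal" if faders["formality"] > 0.5 else "casual"
    return {
        "temperature": round(temperature, 2),
        "max_tokens": max_tokens,
        "system_prompt": f"Write in a {formality} register.",
    }

settings = map_controls({"creativity": 0.8, "length": 0.5, "formality": 0.2})
print(settings["temperature"], settings["max_tokens"])  # 1.24 275
```

In a real-time setup like the one described, a loop would poll the hardware (e.g. over serial or MIDI) and re-issue the LLM API call whenever a fader moves, which is what gives the interaction its "mixing console" immediacy.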

📝 Abstract
The NIME conference traditionally focuses on interfaces for music and musical expression. In this paper we reverse this tradition to ask: can interfaces developed for music be successfully appropriated for non-musical applications? To help answer this question we designed and developed a new device, which uses interface metaphors borrowed from analogue synthesisers and audio mixing to physically control the intangible aspects of a Large Language Model. We compared two versions of the device, with and without the audio-inspired augmentations, with a group of artists who used each version over a one-week period. Our results show that the audio-like controls afforded more immediate, direct and embodied control over the LLM, allowing users to creatively experiment and play with the device compared to its non-mixer counterpart. Our project demonstrates how cross-sensory metaphors can support creative thinking and embodied practice when designing new technological interfaces.
Problem

Research questions and friction points this paper is trying to address.

Can music interfaces work for non-musical applications?
How do audio metaphors improve control over Large Language Models?
Do cross-sensory metaphors enhance creative interface design?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Audio-inspired controls for LLM manipulation
Cross-sensory metaphors enhance creative interfaces
Mixer metaphors enable embodied AI interaction