🤖 AI Summary
This study addresses the limitations of traditional handcrafted user personas, which are often abstract, costly to produce, and difficult to translate into actionable design features, limiting their practical utility in product design. To overcome these challenges, this work proposes the first interactive system grounded in multimodal large language models (MLLMs) that integrates census-based demographic data with an interactive interface, enabling designers to generate fine-grained user personas. The system further derives structured design features from persona attributes automatically and recombines them for specific customer segments, achieving an end-to-end transformation from abstract representations to concrete design elements. In an evaluation with twelve professional designers, the approach significantly outperformed a chat-based baseline in persona engagement, perceived transparency, and user satisfaction.
📝 Abstract
Product designers often begin their design process with handcrafted personas. While personas are intended to ground design decisions in consumer preferences, they often fall short in practice: they remain abstract, are expensive to produce, and are difficult to translate into actionable design features. As a result, personas risk serving as static reference points rather than tools that actively shape design outcomes. To address these challenges, we built Personagram, an interactive system powered by multimodal large language models (MLLMs) that helps designers explore detailed census-based personas, extract product features inferred from persona attributes, and recombine those features for specific customer segments. In a study with 12 professional designers, we show that Personagram supports more actionable ideation workflows by structuring multimodal reasoning from persona attributes to product design features, achieving higher persona engagement, perceived transparency, and satisfaction than a chat-based baseline. We discuss implications of integrating AI-generated personas into product design workflows.