🤖 AI Summary
To address the limited co-exploration of visual and functional aspects in conceptual product design, this paper proposes a multimodal generative method that integrates large language models (LLMs) with image-based generative AI (GenAI). The method establishes a bidirectional "functional description ↔ visual component" mapping workflow, enabling sketch- and text-driven decomposition, generation, and interactive editing of design concepts. Its key novelty lies in tightly coupling LLM-generated functional semantic parsing with diffusion-model-driven visual synthesis, achieving cross-modal alignment and recomposition between functionality and appearance. Multi-scenario experiments and user studies report notable improvements in design efficiency (+37%), creative inspiration (average user creativity rating up 2.1 on a 5-point scale), and usability (System Usability Scale score of 84.6). The approach offers a systematic and interpretable paradigm for conceptual design.
📝 Abstract
Conceptual product design requires designers to explore the design space of visual and functional concepts simultaneously. Sketching has long been adopted to empower concept exploration. However, current sketch-based design tools mostly emphasize visual design using emerging techniques. We present SketchConcept, a design support tool that decomposes design concepts into visual representations and functional descriptions using sketches and text. We propose a function-to-visual mapping workflow that maps each function description generated by a large language model (LLM) to a component of the concept produced by image generative artificial intelligence (GenAI). This mapping allows our system to leverage multimodal GenAI to decompose, generate, and edit the design concept to satisfy the overall function and behavior. We present multiple use cases enabled by SketchConcept to validate the workflow, and we evaluate the efficacy and usability of our system in a two-session user study.
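The function-to-visual mapping described above can be sketched as a small data structure that keeps functional descriptions and visual components in lockstep. The sketch below is a minimal illustration, not the paper's implementation: the `decompose_functions` and `generate_visual` helpers are hypothetical stubs standing in for the LLM parsing step and the diffusion-based rendering step, respectively, so the bidirectional lookup can run without any model.

```python
from dataclasses import dataclass


@dataclass
class Component:
    function: str  # functional description (LLM side)
    visual: str    # visual component label (image-GenAI side)


def decompose_functions(concept: str) -> list[str]:
    # Stub for the LLM step: the real system would prompt an LLM to
    # parse the concept into functional descriptions.
    examples = {
        "desk lamp": ["illuminate workspace", "adjust light angle", "stand stably"],
    }
    return examples.get(concept, [])


def generate_visual(function: str) -> str:
    # Stub for the image-GenAI step: a diffusion model would render a
    # sketch component for each function; here we just return a label.
    return f"sketch-of[{function}]"


def build_mapping(concept: str) -> dict[str, Component]:
    # Bidirectional "function ↔ visual" table: editing either side can
    # locate its counterpart, enabling decompose/generate/edit cycles.
    mapping: dict[str, Component] = {}
    for fn in decompose_functions(concept):
        comp = Component(function=fn, visual=generate_visual(fn))
        mapping[fn] = comp           # forward: function -> component
        mapping[comp.visual] = comp  # reverse: visual -> component
    return mapping
```

Under these assumptions, `build_mapping("desk lamp")` yields a table where each functional description and its rendered component resolve to the same `Component`, which is the property the editing workflow relies on.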