Show Me the Infographic I Imagine: Intent-Aware Infographic Retrieval for Authoring Support

πŸ“… 2026-04-09
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This work addresses the ambiguity inherent in users’ natural language expressions of infographic design intent, and the inadequacy of existing retrieval methods for the multi-component, text-intensive nature of infographics. The authors propose an intent-aware infographic retrieval framework that first establishes a taxonomy of infographic-specific design intents and uses it to transform free-text queries into structured intent cues. An interactive agent then bridges high-level editing intentions and low-level visual designs, allowing users to adapt retrieved exemplars effectively. Grounded in qualitative user studies, intent modeling, and multimodal retrieval optimization, the proposed approach significantly outperforms baseline methods in retrieval quality, intent satisfaction, and creative efficiency, offering a precise and effective intelligent assistant for infographic creation.
πŸ“ Abstract
While infographics have become a powerful medium for communicating data-driven stories, authoring them from scratch remains challenging, especially for novice users. Retrieving relevant exemplars from a large corpus can provide design inspiration and promote reuse, substantially lowering the barrier to infographic authoring. However, effective retrieval is difficult because users often express design intent in ambiguous natural language, while infographics embody rich and multi-faceted visual designs. As a result, keyword-based search often fails to capture design intent, and general-purpose vision-language retrieval models trained on natural images are ill-suited to the text-heavy, multi-component nature of infographics. To address these challenges, we develop an intent-aware infographic retrieval framework that better aligns user queries with infographic designs. We first conduct a formative study of how people describe infographics and derive an intent taxonomy spanning content and visual design facets. This taxonomy is then leveraged to enrich and refine free-form user queries, guiding the retrieval process with intent-specific cues. Building on the retrieved exemplars, users can adapt the designs to their own data with high-level edit intents, supported by an interactive agent that performs low-level adaptation. Both quantitative evaluations and user studies are conducted to demonstrate that our method improves retrieval quality over baseline methods while better supporting intent satisfaction and efficient infographic authoring.
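The pipeline the abstract describes — enriching a free-form query with taxonomy-derived intent cues, then retrieving exemplars that match those cues — can be illustrated with a toy sketch. The facet vocabulary, cue extractor, and scoring function below are illustrative assumptions, not the paper's implementation:

```python
# Toy illustration of intent-aware retrieval (NOT the paper's method):
# a free-text query is mapped onto structured intent cues (content keywords
# plus visual-design facets), and candidate infographics are ranked by how
# many cues their metadata satisfies. The facet vocabulary is invented.

FACETS = {
    "chart_type": {"bar chart", "timeline", "pie chart", "map"},
    "style": {"minimal", "colorful", "hand-drawn"},
}

def extract_intent_cues(query: str) -> dict:
    """Turn a free-form query into structured cues via simple phrase matching."""
    q = query.lower()
    cues = {facet: {v for v in vocab if v in q} for facet, vocab in FACETS.items()}
    # Words that do not belong to any design facet are treated as content cues.
    facet_words = {w for vocab in FACETS.values() for v in vocab for w in v.split()}
    cues["content"] = {w for w in q.split() if w not in facet_words}
    return cues

def score(cues: dict, meta: dict) -> int:
    """Count how many intent cues an infographic's metadata satisfies."""
    return sum(len(wanted & set(meta.get(facet, [])))
               for facet, wanted in cues.items())

def retrieve(query: str, corpus: list[dict], k: int = 2) -> list[str]:
    """Rank the corpus by cue overlap and return the top-k infographic ids."""
    cues = extract_intent_cues(query)
    ranked = sorted(corpus, key=lambda m: score(cues, m), reverse=True)
    return [m["id"] for m in ranked[:k]]
```

In the actual system, the substring matcher would be replaced by taxonomy-guided query refinement and the overlap count by a learned multimodal similarity; the sketch only shows how structured cues constrain retrieval more tightly than raw keyword search.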
Problem

Research questions and friction points this paper aims to address.

infographic retrieval
design intent
vision-language retrieval
authoring support
natural language query
Innovation

Methods, ideas, or system contributions that make the work stand out.

intent-aware retrieval
infographic authoring
vision-language alignment
design intent taxonomy
interactive adaptation