ClearFairy: Capturing Creative Workflows through Decision Structuring, In-Situ Questioning, and Rationale Inference

📅 2025-09-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing methods struggle to comprehensively capture the implicit professional reasoning underlying creative work, resulting in non-auditable decisions and poor knowledge transfer. This paper introduces CLEAR, a cognitive-aware framework, and ClearFairy, an intelligent assistant that jointly models cognitive decision structures and lightweight human-AI interaction. Integrated into UI design tools, ClearFairy completes and augments implicit rationales in real time via context-aware questioning, natural-language inference, and generation. In a user study with 12 professional designers, 85% of generated rationales were accepted, and the proportion of strongly explanatory rationales rose significantly, from 14% to over 83%. Consequently, generative AI predictions improved in both accuracy and output coherence. This work establishes a novel paradigm for explainable AI (XAI) and human-AI co-design, advancing traceability, interpretability, and collaborative creativity in design workflows.

📝 Abstract
Capturing professionals' decision-making in creative workflows is essential for reflection, collaboration, and knowledge sharing, yet existing methods often leave rationales incomplete and implicit decisions hidden. To address this, we present the CLEAR framework, which structures reasoning into cognitive decision steps: linked units of actions, artifacts, and self-explanations that make decisions traceable. Building on this framework, we introduce ClearFairy, a think-aloud AI assistant for UI design that detects weak explanations, asks lightweight clarifying questions, and infers missing rationales to ease the knowledge-sharing burden. In a study with twelve creative professionals, 85% of ClearFairy's inferred rationales were accepted, increasing strong explanations from 14% to over 83% of decision steps without adding cognitive demand. The captured steps also enhanced generative AI agents in Figma, yielding next-action predictions better aligned with professionals and producing more coherent design outcomes. For future research on human knowledge-grounded creative AI agents, we release a dataset of 417 captured decision steps.
Problem

Research questions and friction points this paper is trying to address.

Capturing incomplete rationales in creative workflows
Making implicit decisions traceable and explicit
Reducing knowledge-sharing burden through AI assistance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Structures reasoning into cognitive decision steps
Detects weak explanations and asks clarifying questions
Infers missing rationales to ease knowledge-sharing burden
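The decision-step structure described above could be sketched roughly as follows. This is a minimal illustration, not the paper's actual schema: the field names (`action`, `artifact`, `explanation`) and the word-count heuristic for detecting weak explanations are assumptions made for clarity.

```python
from dataclasses import dataclass

@dataclass
class DecisionStep:
    """One CLEAR-style unit: an action on an artifact plus a self-explanation.
    Field names are illustrative, not taken from the paper."""
    action: str
    artifact: str
    explanation: str = ""
    inferred: bool = False  # True if the rationale was AI-inferred rather than user-stated

def is_weak(step: DecisionStep, min_words: int = 5) -> bool:
    """Toy heuristic standing in for ClearFairy's weak-explanation detection:
    flag explanations that are missing or very short."""
    return len(step.explanation.split()) < min_words

steps = [
    DecisionStep("resize", "hero-banner", "looked off"),
    DecisionStep("recolor", "CTA button",
                 "Increased contrast to meet WCAG AA for primary actions"),
]
weak = [s for s in steps if is_weak(s)]  # only the under-explained "resize" step
```

In the paper's workflow, steps flagged this way would trigger a lightweight clarifying question or an AI-inferred rationale (marked here by the `inferred` flag) rather than simple rejection.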