Generative and Malleable User Interfaces with Generative and Evolving Task-Driven Data Model

📅 2025-03-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing UI generation approaches rely on automatic code generation, which hinders end-user iteration and limits flexibility and adaptability. This paper proposes a task-driven, evolvable UI generation framework built on a unified semantic representation (a task-driven data model) that integrates large language model-based intent understanding, graph-structured data modeling, UI specification mapping, and bidirectional compilation between natural language and the model. The framework generates UIs from natural language prompts and supports real-time, interactive updates to the underlying model through direct manipulation and conversational editing, establishing an end-to-end UI generation paradigm that is interpretable, open to intervention, and iterative. A technical evaluation demonstrates significant improvements in generation accuracy and consistency, and a user study confirms that non-programmers can efficiently customize complex interfaces: average editing time decreased by 62%, and task completion rate increased by 41%.
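The summary's central idea, representing a task as a small graph of entities, attributes, and relationships rather than as generated code, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual schema: the class names, attribute-type strings, and the trip-planning example are all assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a "task-driven data model": entities with typed
# attributes, plus labeled relationships between them. Names are illustrative.

@dataclass
class Entity:
    name: str
    attributes: dict = field(default_factory=dict)  # attribute name -> type

@dataclass
class Relationship:
    source: str
    target: str
    label: str

@dataclass
class TaskDataModel:
    entities: dict = field(default_factory=dict)
    relationships: list = field(default_factory=list)

    def add_entity(self, name, **attrs):
        self.entities[name] = Entity(name, dict(attrs))

    def relate(self, source, target, label):
        self.relationships.append(Relationship(source, target, label))

# Example: a trip-planning task expressed as a data model. A UI generator
# could then map each attribute type to a widget and each relationship
# to a nested list or detail view.
model = TaskDataModel()
model.add_entity("Trip", destination="text", start_date="date")
model.add_entity("Activity", title="text", cost="number")
model.relate("Trip", "Activity", "includes")
```

Because the model, not the rendered code, is the source of truth, later edits (by prompt or direct manipulation) mutate this graph and the UI is re-derived from it.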

📝 Abstract
Unlike static and rigid user interfaces, generative and malleable user interfaces offer the potential to respond to diverse users' goals and tasks. However, current approaches primarily rely on generating code, making it difficult for end-users to iteratively tailor the generated interface to their evolving needs. We propose employing task-driven data models (representing the essential information entities, relationships, and data within information tasks) as the foundation for UI generation. We leverage AI to interpret users' prompts and generate the data models that describe users' intended tasks, and by mapping the data models with UI specifications, we can create generative user interfaces. End-users can easily modify and extend the interfaces via natural language and direct manipulation, with these interactions translated into changes in the underlying model. The technical evaluation of our approach and user evaluation of the developed system demonstrate the feasibility and effectiveness of the proposed generative and malleable UIs.
Problem

Research questions and friction points this paper is trying to address.

Static user interfaces cannot adapt to users' evolving tasks.
Current code-generation approaches lack user-friendly iterative customization.
AI-driven task models could enable flexible UI generation and modification.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Task-driven data models for UI generation
AI interprets prompts to create data models
Natural language modifies underlying UI models
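The last point, natural-language edits applied to the underlying model rather than to generated code, can be sketched as below. This is an assumed design for illustration: the paper uses LLM-based intent understanding, so the regex "compiler", the operation format, and the dict-based model here are all stand-in assumptions.

```python
import re

def compile_edit(utterance):
    """Translate a narrow class of edit phrases into a structured model
    operation. A real system would use an LLM; this regex is a stand-in."""
    m = re.match(r"add (?:a )?(\w+) field (?:called )?(\w+) to (\w+)", utterance)
    if m:
        ftype, name, entity = m.groups()
        return {"op": "add_attribute", "entity": entity,
                "name": name, "type": ftype}
    return None

def apply_edit(model, op):
    """Apply a compiled operation to a dict-based data model in place.
    The UI would then be regenerated from the updated model."""
    if op and op["op"] == "add_attribute":
        model["entities"][op["entity"]]["attributes"][op["name"]] = op["type"]
    return model

# Conversational edit -> model change (the UI spec is re-derived afterward).
model = {"entities": {"Trip": {"attributes": {"destination": "text"}}}}
op = compile_edit("add a number field called budget to Trip")
apply_edit(model, op)
```

Keeping edits as operations on the model is what makes the interface malleable: the same pathway serves both conversational editing and direct manipulation.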