Learning with Challenges: Adaptive Difficulty-Aware Data Generation for Mobile GUI Agent Training

📅 2026-01-30
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work addresses a limitation of existing mobile GUI agent training data: the lack of fine-grained control over task difficulty, which often creates a mismatch between task complexity and agent capability that hinders effective learning. To overcome this, the authors propose MobileGen, a novel framework that, for the first time, decouples task difficulty into structural and semantic dimensions. MobileGen employs a multi-agent controllable generator that dynamically models the agent's capability boundary and, through distribution-aware sampling, adaptively synthesizes high-quality interaction trajectories and task instructions aligned with the agent's current proficiency. This enables difficulty-adaptive curriculum learning tailored to mobile GUI environments. Experimental results show that the approach improves agent performance by an average of 1.57× across multiple challenging benchmarks, significantly outperforming existing data generation strategies.

πŸ“ Abstract
Large-scale, high-quality interaction trajectories are essential for advancing mobile Graphical User Interface (GUI) agents. While existing methods typically rely on labor-intensive human demonstrations or automated model exploration to generate GUI trajectories, they lack fine-grained control over task difficulty. This fundamentally restricts learning effectiveness due to the mismatch between the training difficulty and the agent's capabilities. Inspired by how humans acquire skills through progressively challenging tasks, we propose MobileGen, a novel data generation framework that adaptively aligns training difficulty with the GUI agent's capability frontier. Specifically, MobileGen explicitly decouples task difficulty into structural (e.g., trajectory length) and semantic (e.g., task goal) dimensions. It then iteratively evaluates the agent on a curated prior dataset to construct a systematic profile of its capability frontier across these two dimensions. With this profile, the probability distribution of task difficulty is adaptively computed, from which the target difficulty for the next round of training can be sampled. Guided by the sampled difficulty, a multi-agent controllable generator is finally used to synthesize high-quality interaction trajectories along with corresponding task instructions. Extensive experiments show that MobileGen consistently outperforms existing data generation methods by improving the average performance of GUI agents by 1.57 times across multiple challenging benchmarks. This highlights the importance of capability-aligned data generation for effective mobile GUI agent training.
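The core loop the abstract describes (profile the agent's capability frontier per difficulty bin, adaptively compute a sampling distribution over difficulties, then sample the next round's target difficulty) can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's implementation: the binning by trajectory length and the `s·(1−s)` weighting (which concentrates mass where success rates sit near 50%, i.e. at the capability frontier) are assumptions chosen for clarity, and all function names are hypothetical.

```python
import random

def capability_profile(agent_results):
    """Success rate per difficulty bin (here: bins keyed by trajectory
    length, a structural difficulty dimension)."""
    return {d: sum(outcomes) / len(outcomes)
            for d, outcomes in agent_results.items()}

def difficulty_distribution(profile):
    """Weight each bin by s*(1-s): mass peaks where success is ~50%,
    i.e. at the capability frontier (illustrative weighting choice).
    A small epsilon keeps mastered/failed bins reachable."""
    weights = {d: s * (1.0 - s) + 1e-6 for d, s in profile.items()}
    total = sum(weights.values())
    return {d: w / total for d, w in weights.items()}

def sample_target_difficulty(dist, rng=random):
    """Draw the target difficulty for the next round of data generation."""
    difficulties, probs = zip(*sorted(dist.items()))
    return rng.choices(difficulties, weights=probs, k=1)[0]

# Toy evaluation results: success drops as trajectory length grows.
results = {2: [1, 1, 1, 1], 4: [1, 1, 0, 1], 6: [1, 0, 0, 1], 8: [0, 0, 0, 0]}
dist = difficulty_distribution(capability_profile(results))
target = sample_target_difficulty(dist)
```

In this toy example the bin with a 50% success rate receives the largest sampling probability, so generation effort concentrates on tasks just beyond the agent's current proficiency, while trivially easy and hopelessly hard bins are mostly avoided.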
Problem

Research questions and friction points this paper is trying to address.

mobile GUI agent
task difficulty
data generation
capability alignment
interaction trajectories
Innovation

Methods, ideas, or system contributions that make the work stand out.

adaptive difficulty
capability-aligned training
mobile GUI agent
controllable data generation
difficulty-aware synthesis
Linjia Kang
Tsinghua University
Zhimin Wang
Tsinghua University
Yongkang Zhang
Huazhong Agricultural University
Duo Wu
Tsinghua University
Jinghe Wang
Tsinghua University
Ming Ma
Kuaishou Technology
Haopeng Yan
Kuaishou Technology
Zhi Wang
Associate Professor, SIGS, Tsinghua University
multimedia network, edge computing, distributed machine learning