DynaDrag: Dynamic Drag-Style Image Editing by Motion Prediction

πŸ“… 2026-01-02
πŸ›οΈ arXiv.org
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
This work proposes DynaDrag, the first drag-based image editing method built upon a β€œpredict-and-move” framework, addressing the limitations of traditional approaches that often suffer from tracking failures or domain mismatches between source and target regions. By iteratively performing motion prediction and motion supervision while dynamically adjusting active control points, DynaDrag achieves high-quality, pixel-level editing without relying on conventional tracking mechanisms. The introduction of a dynamic control point strategy significantly enhances both the plausibility and controllability of edits. Extensive experiments on face and human datasets demonstrate that DynaDrag consistently outperforms existing methods in terms of visual naturalness and structural consistency.

πŸ“ Abstract
To achieve pixel-level image manipulation, drag-style image editing, which edits images using points or trajectories as conditions, is attracting widespread attention. Most previous methods follow the move-and-track framework, in which missed tracking and ambiguous tracking are unavoidable challenges. Methods under other frameworks suffer from different problems, such as the large gap between the source image and the target edited image, as well as unreasonable intermediate points, which can lead to low editability. To avoid these problems, we propose DynaDrag, the first dragging method under a predict-and-move framework. In DynaDrag, Motion Prediction and Motion Supervision are performed iteratively. In each iteration, Motion Prediction first predicts where the handle points should move, and then Motion Supervision drags them accordingly. We also propose to dynamically adjust the set of valid handle points to further improve performance. Experiments on face and human datasets demonstrate the superiority of DynaDrag over previous works.
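The iterative loop described in the abstract can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: the function names (`predict_motion`, `drag`), the 2-D tuple representation of handle points, and the straight-line prediction toward the target are all assumptions made here for clarity. In the actual method, Motion Prediction and Motion Supervision operate on learned features rather than raw coordinates.

```python
# Hypothetical sketch of the predict-and-move loop: Motion Prediction
# proposes the next position for each handle point, Motion Supervision
# moves it there, and points that reach their targets are deactivated
# (the "dynamic handle adjustment" idea).

def predict_motion(handle, target, step=1.0):
    """Motion Prediction: predict where a handle point should move next."""
    dx, dy = target[0] - handle[0], target[1] - handle[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= step:  # close enough: snap directly to the target
        return target
    return (handle[0] + step * dx / dist, handle[1] + step * dy / dist)

def drag(handles, targets, step=1.0, max_iters=100):
    """Iterate Motion Prediction and Motion Supervision, dynamically
    deactivating handle points that have reached their targets."""
    handles = list(handles)
    active = list(range(len(handles)))
    for _ in range(max_iters):
        if not active:
            break
        for i in active:
            # Motion Supervision would drag image features toward this
            # predicted position; here we just move the point itself.
            handles[i] = predict_motion(handles[i], targets[i], step)
        # Dynamic adjustment: drop points already at their targets.
        active = [i for i in active if handles[i] != targets[i]]
    return handles

print(drag([(0.0, 0.0)], [(3.0, 4.0)], step=1.0))
```

A single handle at the origin with target (3, 4) reaches it in five unit steps, after which the point is removed from the active set and the loop terminates early.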
Problem

Research questions and friction points this paper is trying to address.

drag-style image editing
motion prediction
pixel-level manipulation
tracking ambiguity
image editability
Innovation

Methods, ideas, or system contributions that make the work stand out.

drag-style editing
motion prediction
predict-and-move framework
dynamic handle adjustment
image manipulation
πŸ”Ž Similar Papers
No similar papers found.