ARAP-GS: Drag-driven As-Rigid-As-Possible 3D Gaussian Splatting Editing with Diffusion Prior

📅 2025-04-17
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses the lack of efficient, shape-preserving interactive geometric editing methods for 3D Gaussian Splatting (3DGS) scenes. We propose the first drag-based editing framework grounded in As-Rigid-As-Possible (ARAP) constraints, directly applying ARAP deformations to the Gaussian ellipsoid parameter space to enable control-point-driven structural reshaping. To preserve deformation fidelity and rendering quality, we integrate diffusion-model-guided super-resolution optimization with multi-view consistency constraints. Our approach retains the native expressive advantages of 3DGS while significantly improving editing robustness and visual continuity across frames. Experiments demonstrate state-of-the-art performance on diverse complex scenes, achieving high-quality edits in only 10–20 minutes on a single RTX 3090 GPU, delivering superior efficiency, geometric accuracy, and cross-view consistency compared to existing methods.

๐Ÿ“ Abstract
Drag-driven editing has become popular among designers for its ability to modify complex geometric structures through simple and intuitive manipulation, allowing users to adjust and reshape content with minimal technical skill. This drag operation has been incorporated into numerous methods to facilitate the editing of 2D images and 3D meshes in design. However, few studies have explored drag-driven editing for the widely-used 3D Gaussian Splatting (3DGS) representation, as deforming 3DGS while preserving shape coherence and visual continuity remains challenging. In this paper, we introduce ARAP-GS, a drag-driven 3DGS editing framework based on As-Rigid-As-Possible (ARAP) deformation. Unlike previous 3DGS editing methods, we are the first to apply ARAP deformation directly to 3D Gaussians, enabling flexible, drag-driven geometric transformations. To preserve scene appearance after deformation, we incorporate an advanced diffusion prior for image super-resolution within our iterative optimization process. This approach enhances visual quality while maintaining multi-view consistency in the edited results. Experiments show that ARAP-GS outperforms current methods across diverse 3D scenes, demonstrating its effectiveness and superiority for drag-driven 3DGS editing. Additionally, our method is highly efficient, requiring only 10 to 20 minutes to edit a scene on a single RTX 3090 GPU.
Problem

Research questions and friction points this paper is trying to address.

Enables drag-driven editing for 3D Gaussian Splatting representation
Preserves shape coherence and visual continuity during deformation
Integrates diffusion prior for high-quality multi-view consistent results
Innovation

Methods, ideas, or system contributions that make the work stand out.

ARAP deformation for 3D Gaussian Splatting
Diffusion prior for image super-resolution
Drag-driven 3DGS editing framework
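The core ARAP idea named above, penalizing deviation from locally rigid motion, can be sketched as follows. This is a minimal NumPy illustration of the standard ARAP energy with per-cell best-fit rotations computed via SVD (Kabsch); the function names and uniform edge weights are our own illustrative choices, not details taken from the paper, which applies the deformation to Gaussian ellipsoid parameters rather than bare points:

```python
import numpy as np

def best_fit_rotation(edges_rest, edges_def):
    """Kabsch: rotation R minimizing sum_j ||e_def_j - R e_rest_j||^2."""
    S = edges_rest.T @ edges_def          # 3x3 cross-covariance of edge sets
    U, _, Vt = np.linalg.svd(S)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # reject reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R

def arap_energy(points, deformed, neighbors):
    """Sum of per-cell rigidity residuals with uniform weights:
    E = sum_i sum_{j in N(i)} ||(p'_i - p'_j) - R_i (p_i - p_j)||^2."""
    E = 0.0
    for i, nbrs in enumerate(neighbors):
        e_rest = points[nbrs] - points[i]     # rest-pose edges from cell i
        e_def = deformed[nbrs] - deformed[i]  # deformed edges from cell i
        R = best_fit_rotation(e_rest, e_def)
        E += np.sum((e_def - e_rest @ R.T) ** 2)
    return E
```

A global rigid motion (rotation plus translation) drives this energy to zero, while stretching or shearing does not, which is what makes the energy a useful shape-preservation term during drag-driven optimization.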
Authors
Xiao Han (Nanjing University)
Runze Tian (Nanjing University)
Yifei Tong (Nanjing University)
Fenggen Yu (Applied Scientist at Amazon; Computer Graphics, Computer Vision)
Dingyao Liu (Nanjing University)
Yan Zhang (Nanjing University)