3D PixBrush: Image-Guided Local Texture Synthesis

📅 2025-07-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the problem of image-driven local texture editing of 3D meshes without user input—specifically, how to automatically predict semantically aligned localization masks and high-fidelity synthesized textures on 3D geometry from a single reference image. To this end, we propose a localization-modulated, image-guided mechanism that extends Score Distillation Sampling (SDS) to enable end-to-end generation of geometry-aware local masks. Furthermore, we integrate a diffusion model with a lightweight mask prediction network to jointly ensure semantic consistency and geometric compatibility across both global context and local texture details. Extensive experiments on diverse 3D mesh categories and real-world images demonstrate that our method significantly outperforms existing unsupervised editing approaches in localization accuracy, texture fidelity, and editing consistency.

📝 Abstract
We present 3D PixBrush, a method for performing image-driven edits of local regions on 3D meshes. 3D PixBrush predicts a localization mask and a synthesized texture that faithfully portray the object in the reference image. Our predicted localizations are both globally coherent and locally precise. Globally, our method contextualizes the object in the reference image and automatically positions it onto the input mesh. Locally, our method produces masks that conform to the geometry of the reference image. Notably, our method does not require any user input (in the form of scribbles or bounding boxes) to achieve accurate localizations. Instead, our method predicts a localization mask on the 3D mesh from scratch. To achieve this, we propose a modification to the score distillation sampling technique which incorporates both the predicted localization and the reference image, referred to as localization-modulated image guidance. We demonstrate the effectiveness of our proposed technique on a wide variety of meshes and images.
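To make the core idea concrete, the sketch below shows one plausible form of localization-modulated image guidance: a classifier-free-guidance-style SDS gradient whose update is gated by the predicted localization mask, so image-conditioned edits only flow into the masked region. This is a minimal illustration under assumed names (`sds_gradient`, the tensor shapes, the guidance scale), not the paper's actual implementation, and the real method conditions a diffusion model on the reference image and renders the mesh differentiably.

```python
import numpy as np

def sds_gradient(eps_pred_cond, eps_pred_uncond, eps, mask, w_t, guidance_scale=7.5):
    """Hypothetical localization-modulated SDS gradient (illustrative only).

    eps_pred_cond   -- noise predicted by the diffusion model conditioned on the reference image
    eps_pred_uncond -- unconditional noise prediction
    eps             -- the noise actually injected at timestep t
    mask            -- predicted localization mask in [0, 1], same shape as the render
    w_t             -- timestep-dependent weighting w(t)
    """
    # Classifier-free guidance: steer toward the image-conditioned prediction.
    eps_guided = eps_pred_uncond + guidance_scale * (eps_pred_cond - eps_pred_uncond)
    # Standard SDS gradient direction: w(t) * (predicted noise - injected noise).
    grad = w_t * (eps_guided - eps)
    # Localization modulation: restrict the image-guided update to the mask region.
    return mask * grad

# Dummy example with random tensors standing in for rendered latents.
rng = np.random.default_rng(0)
shape = (3, 8, 8)
eps = rng.standard_normal(shape)
eps_cond = rng.standard_normal(shape)
eps_uncond = rng.standard_normal(shape)
mask = np.zeros(shape)
mask[:, :4, :] = 1.0  # edit only the top half

g = sds_gradient(eps_cond, eps_uncond, eps, mask, w_t=0.5)
print(g.shape)                          # (3, 8, 8)
print(np.allclose(g[:, 4:, :], 0.0))    # True: no gradient outside the mask
```

The key property this illustrates is that regions outside the predicted mask receive zero image-guided gradient, which is what lets the rest of the mesh texture stay untouched while the localized region is synthesized to match the reference image.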
Problem

Research questions and friction points this paper is trying to address.

Image-driven local texture editing on 3D meshes
Automatic localization mask prediction without user input
Globally coherent and locally precise texture synthesis
Innovation

Methods, ideas, or system contributions that make the work stand out.

Image-driven local texture synthesis on 3D meshes
Localization-modulated image guidance technique
Automatic mask prediction without user input