Style Brush: Guided Style Transfer for 3D Objects

📅 2025-10-03
📈 Citations: 0
Influential citations: 0
🤖 AI Summary
This work addresses controllable style transfer for textured 3D meshes. We propose an artist-centric multi-style fusion method designed for intuitive interactive editing. Our approach introduces: (1) a lightweight guide texture as user input, drastically lowering the interaction barrier; (2) a style-direction-aware loss function that explicitly models the spatial orientation of style features, enabling flexible guidance from single or multiple style images, including localized regions; and (3) a UV-mapping-based multi-style fusion mechanism ensuring spatial continuity and smooth semantic transitions between styles in texture space. Extensive experiments across diverse mesh topologies, style sources, and silhouette shapes show that our method preserves geometric fidelity while producing finely controllable, interactively efficient stylized textures, significantly enhancing both controllability and expressive power in 3D texture stylization.
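The summary's "style-direction-aware loss" refers to explicitly modeling the spatial orientation of style features. The paper's actual formulation is not reproduced on this page; as a minimal toy illustration of the general idea, one can compare gradient-orientation histograms of two images, so that styles with the same colors but different stroke directions yield a nonzero loss. All function names here are hypothetical.

```python
import numpy as np

def orientation_histogram(img, bins=8):
    """Magnitude-weighted histogram of gradient orientations in [0, pi).
    A toy stand-in for a direction-aware style descriptor; the paper's
    actual loss is not reproduced here."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)  # fold opposite directions together
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi), weights=mag)
    total = hist.sum()
    return hist / total if total > 0 else hist

def direction_aware_style_loss(synth, style):
    """Squared L2 distance between the orientation histograms of two
    grayscale images: zero for matching stroke directions, large when
    the dominant orientations differ."""
    d = orientation_histogram(synth) - orientation_histogram(style)
    return float(np.sum(d ** 2))
```

With this descriptor, a texture of horizontal stripes scores zero loss against itself but a large loss against vertical stripes, even though both have identical color statistics.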

📝 Abstract
We introduce Style Brush, a novel style transfer method for textured meshes designed to empower artists with fine-grained control over the stylization process. Our approach extends traditional 3D style transfer methods by introducing a novel loss function that captures style directionality, supports multiple style images or portions thereof, and enables smooth transitions between styles in the synthesized texture. The use of easily generated guiding textures streamlines user interaction, making our approach accessible to a broad audience. Extensive evaluations with various meshes, style images, and contour shapes demonstrate the flexibility of our method and showcase the visual appeal of the generated textures.
Problem

Research questions and friction points this paper is trying to address.

Traditional 3D style transfer offers limited fine-grained artistic control over mesh stylization
Existing loss functions do not capture the directionality of style features
Combining multiple style images, or portions thereof, with smooth transitions is not well supported
Innovation

Methods, ideas, or system contributions that make the work stand out.

Novel loss function captures style directionality
Supports multiple style images and smooth transitions
Uses guiding textures for streamlined user interaction
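The smooth-transition contribution above works in UV texture space: stylized results are fused per texel so that style boundaries stay spatially continuous. The paper's actual fusion mechanism is not detailed on this page; the sketch below is a generic illustration of the idea, blending two stylized textures with a smoothstep weight mask over the U coordinate. All names are hypothetical.

```python
import numpy as np

def smooth_blend_mask(width, height, transition=0.25):
    """Horizontal UV-space weight mask: 0 on the left, 1 on the right,
    with a smoothstep ramp of the given width around u = 0.5.
    Toy example; the paper's actual fusion mechanism is not reproduced."""
    u = np.linspace(0.0, 1.0, width)
    t = np.clip((u - 0.5 + transition / 2) / transition, 0.0, 1.0)
    w = t * t * (3.0 - 2.0 * t)  # smoothstep for a C1-continuous transition
    return np.tile(w, (height, 1))

def fuse_textures(tex_a, tex_b, mask):
    """Per-texel linear interpolation between two stylized RGB textures
    of shape (height, width, 3), driven by a (height, width) mask."""
    m = mask[..., None]
    return (1.0 - m) * tex_a + m * tex_b
```

A smoothstep (rather than hard) mask avoids visible seams where the two styles meet on the surface, which is the "semantic smoothness" goal the summary describes.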
Áron Samuel Kovács (TU Wien, Austria)
Pedro Hermosilla (TU Wien)
3D Machine Learning · 3D Computer Vision · Point Cloud Processing · Computer Graphics
Renata G. Raidou (TU Wien, Austria)