🤖 AI Summary
This work addresses controllable style transfer for textured 3D meshes. We propose an artist-centric multi-style fusion method designed for intuitive interactive editing. Our approach introduces: (1) a lightweight guide texture as user input, drastically lowering the interaction barrier; (2) a style-direction-aware loss function that explicitly models the spatial orientation of style features, enabling flexible guidance from single or multiple style images, including localized regions; and (3) a UV-mapping-based multi-style fusion mechanism that ensures spatial continuity and semantically smooth style transitions in texture space. Extensive experiments across diverse mesh topologies, style sources, and silhouette shapes demonstrate that our method preserves geometric detail while generating high-fidelity, finely controllable stylized textures at interactive rates, significantly enhancing both controllability and expressive power in 3D texture stylization.
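To make the idea of a direction-aware style loss concrete, the sketch below combines a classic Gram-matrix style term with an additional penalty on the orientation of mean feature directions. This is a generic, hypothetical illustration only, not the paper's actual loss; the function name, the cosine-based direction term, and the equal weighting of the two terms are all assumptions.

```python
import numpy as np

def direction_aware_style_loss(feat, style_feat, eps=1e-8):
    """Hypothetical sketch of a direction-aware style loss.

    feat, style_feat: feature maps of shape (C, H, W), e.g. from a
    pretrained CNN. Combines a Gram-matrix term (second-order style
    statistics) with a cosine-distance term on mean feature
    directions, loosely modeling style orientation.
    """
    # Flatten spatial dimensions: (C, H*W)
    f = feat.reshape(feat.shape[0], -1)
    s = style_feat.reshape(style_feat.shape[0], -1)

    # Classic Gram-matrix style term (Gatys-style statistics match)
    gram_f = f @ f.T / f.shape[1]
    gram_s = s @ s.T / s.shape[1]
    gram_term = np.mean((gram_f - gram_s) ** 2)

    # Direction term: cosine distance between mean feature vectors,
    # standing in for the "spatial orientation of style features"
    df = f.mean(axis=1)
    ds = s.mean(axis=1)
    cos = df @ ds / (np.linalg.norm(df) * np.linalg.norm(ds) + eps)
    dir_term = 1.0 - cos

    # Equal weighting is an arbitrary choice for this sketch
    return gram_term + dir_term
```

With identical content and style features the loss is (near) zero; rotating or re-orienting the style statistics increases the direction term even when the Gram statistics are similar, which is the intuition behind separating the two terms.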
📝 Abstract
We introduce Style Brush, a novel style transfer method for textured meshes designed to empower artists with fine-grained control over the stylization process. Our approach extends traditional 3D style transfer methods by introducing a novel loss function that captures style directionality, supports multiple style images or portions thereof, and enables smooth transitions between styles in the synthesized texture. The use of easily generated guiding textures streamlines user interaction, making our approach accessible to a broad audience. Extensive evaluations with various meshes, style images, and contour shapes demonstrate the flexibility of our method and showcase the visual appeal of the generated textures.