Multi-StyleGS: Stylized Gaussian Splatting with Multiple Styles

📅 2025-04-11
🏛️ AAAI Conference on Artificial Intelligence
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of multi-style artistic editing for 3D Gaussian Splatting scenes—requiring cross-view consistency, memory efficiency, and support for both automatic and manual local style transfer. Methodologically, it introduces a novel bipartite graph matching–driven mechanism to establish correspondences between local styles and scene regions; incorporates semantic style loss and multi-scale local-global feature matching to achieve object-level style disentanglement and enhanced view consistency; and employs memory-aware optimization to significantly improve texture detail and color fidelity under constrained GPU memory. Experiments demonstrate superior performance over state-of-the-art methods on multi-style editing tasks, enabling flexible and robust interactive 3D artistic creation. Key contributions include the first formulation of style-region correspondence via bipartite matching in Gaussian splatting, effective decoupling of stylistic and geometric attributes while preserving view coherence, and efficient rendering with high visual fidelity at low memory cost.
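The style-region correspondence described above can be illustrated with a minimal sketch. This is an assumption-laden toy version, not the authors' implementation: it supposes one feature vector per style image and per scene region (e.g., pooled VGG features) and solves the bipartite matching with the Hungarian algorithm.

```python
# Hypothetical sketch of bipartite style-region matching.
# Assumes precomputed (S, D) style features and (R, D) region features;
# the feature extractor and cost choice are illustrative, not the paper's.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_styles_to_regions(style_feats, region_feats):
    """Assign each rendered-scene region to its best-matching style image.

    style_feats:  (S, D) array, one feature vector per style image.
    region_feats: (R, D) array, one feature vector per scene region.
    Returns a dict mapping region index -> style index.
    """
    # Cost = negative cosine similarity between every (region, style) pair.
    s = style_feats / np.linalg.norm(style_feats, axis=1, keepdims=True)
    r = region_feats / np.linalg.norm(region_feats, axis=1, keepdims=True)
    cost = -(r @ s.T)  # shape (R, S); lower cost = more similar
    region_idx, style_idx = linear_sum_assignment(cost)
    return dict(zip(region_idx.tolist(), style_idx.tolist()))
```

With orthogonal toy features, each region is matched to the style it is most similar to, which is the one-to-one assignment the summary attributes to the bipartite matching step.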

📝 Abstract
In recent years, there has been a growing demand to stylize a given 3D scene to align with the artistic style of reference images for creative purposes. While 3D Gaussian Splatting (GS) has emerged as a promising and efficient method for realistic 3D scene modeling, stylizing a 3D GS scene to match multiple styles, via automatic local style transfer or manual designation, while keeping stylization training memory-efficient remains a challenge. In this paper, we introduce a novel 3D GS stylization solution termed Multi-StyleGS to tackle these challenges. In particular, we employ a bipartite matching mechanism to automatically identify correspondences between the style images and the local regions of the rendered images. To facilitate local style transfer, we introduce a novel semantic style loss function that employs a segmentation network to apply distinct styles to various objects of the scene, and we propose a local-global feature matching to enhance multi-view consistency. Furthermore, this technique achieves memory-efficient training, richer texture details, and better color matching. To assign a robust semantic label to each Gaussian, we propose several techniques to regularize the segmentation network. As demonstrated by our comprehensive experiments, our approach outperforms existing ones in producing plausible stylization results and offering flexible editing.
Problem

Research questions and friction points this paper is trying to address.

Stylizing 3D Gaussian Splatting with multiple artistic styles
Achieving memory-efficient training for multi-style 3D scene modeling
Enhancing multi-view consistency and local style transfer accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bipartite matching for style-region correspondences
Semantic style loss with segmentation network
Local-global feature matching for consistency
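The "semantic style loss with segmentation network" item can be sketched in a few lines. This is a hedged toy version under assumptions the page does not confirm: a Gram-matrix style loss restricted to one object's segmentation mask, written in NumPy for clarity rather than the authors' training code.

```python
# Illustrative mask-restricted (semantic) style loss.
# Assumes VGG-like (C, H, W) feature maps and a soft per-object mask;
# the Gram-matrix formulation here is an assumption, not the paper's exact loss.
import numpy as np

def gram(feat):
    """Gram matrix of a (C, H, W) feature map, normalized by spatial size."""
    c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    return (f @ f.T) / (h * w)

def semantic_style_loss(render_feat, style_feat, mask):
    """Style loss computed only inside one object's segmentation mask.

    render_feat: (C, H, W) features of the rendered view.
    style_feat:  (C, H, W) features of the matched style image.
    mask:        (H, W) soft mask in [0, 1] selecting one object.
    """
    masked = render_feat * mask[None]          # zero out other objects
    diff = gram(masked) - gram(style_feat)
    return float(np.mean(diff ** 2))
```

Applying one such loss per (object, matched style) pair is one plausible way the segmentation network could confine each style to its own object, as the summary describes.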
Authors
Yangkai Lin, South China University of Technology
Jiabao Lei, South China University of Technology (3D Computer Vision)
Kui Jia, School of Data Science, The Chinese University of Hong Kong, Shenzhen