Tune-Your-Style: Intensity-tunable 3D Style Transfer with Gaussian Splatting

πŸ“… 2026-01-31
πŸ“ˆ Citations: 1
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Existing 3D style transfer methods struggle to flexibly balance content preservation and stylistic expression, limiting users’ ability to customize style intensity. To address this, we propose a novel intensity-controllable 3D style transfer framework that, for the first time, models style intensity using Gaussian neurons and introduces a learnable style modulator together with a cross-view alignment guidance mechanism. Built upon 3D Gaussian Splatting (3DGS) and diffusion models, our approach employs a two-stage optimization strategy that fuses full-style and style-free guidance signals to produce multi-view consistent stylized results. Experiments demonstrate that our method achieves superior visual quality and unprecedented controllability, enabling continuous adjustment of style intensity and significantly enhancing the customization capability of 3D style transfer.

πŸ“ Abstract
3D style transfer refers to the artistic stylization of 3D assets based on reference style images. Recently, 3DGS-based stylization methods have drawn considerable attention, primarily due to their markedly enhanced training and rendering speeds. However, a vital challenge for 3D style transfer is to strike a balance between the content and the patterns and colors of the style. Although the existing methods strive to achieve relatively balanced outcomes, the fixed-output paradigm struggles to adapt to the diverse content-style balance requirements from different users. In this work, we introduce a creative intensity-tunable 3D style transfer paradigm, dubbed Tune-Your-Style, which allows users to flexibly adjust the style intensity injected into the scene to match their desired content-style balance, thus enhancing the customizability of 3D style transfer. To achieve this goal, we first introduce Gaussian neurons to explicitly model the style intensity and parameterize a learnable style tuner to achieve intensity-tunable style injection. To facilitate the learning of tunable stylization, we further propose the tunable stylization guidance, which obtains multi-view consistent stylized views from diffusion models through cross-view style alignment, and then employs a two-stage optimization strategy to provide stable and efficient guidance by modulating the balance between full-style guidance from the stylized views and zero-style guidance from the initial rendering. Extensive experiments demonstrate that our method not only delivers visually appealing results, but also exhibits flexible customizability for 3D style transfer. Project page is available at https://zhao-yian.github.io/TuneStyle.
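The tunable stylization guidance described in the abstract modulates the balance between full-style guidance (from diffusion-stylized views) and zero-style guidance (from the initial rendering). A minimal sketch of that blend, assuming a scalar intensity `alpha` in [0, 1]; the function name and the simple linear form are illustrative assumptions, not the paper's exact formulation (which uses Gaussian neurons and a learnable style tuner):

```python
import numpy as np

def tunable_guidance(full_style, zero_style, alpha):
    """Blend full-style and zero-style guidance signals.

    full_style: guidance derived from the stylized views
    zero_style: guidance derived from the initial (unstylized) rendering
    alpha: user-chosen style intensity in [0, 1]
    """
    alpha = float(np.clip(alpha, 0.0, 1.0))
    # alpha = 0 reproduces the original rendering's guidance;
    # alpha = 1 applies full-style guidance.
    return alpha * np.asarray(full_style) + (1.0 - alpha) * np.asarray(zero_style)
```

Under this sketch, continuously sweeping `alpha` yields the continuous adjustment of style intensity the paper demonstrates.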
Problem

Research questions and friction points this paper is trying to address.

3D style transfer
style intensity
content-style balance
customizability
Gaussian Splatting
Innovation

Methods, ideas, or system contributions that make the work stand out.

intensity-tunable style transfer
Gaussian Splatting
3D stylization
diffusion-based guidance
style-content balance
πŸ”Ž Similar Papers
No similar papers found.
Yian Zhao
Peking University
3D Gaussian Splatting, MLLM

Rushi Ye
School of Electronic and Computer Engineering, Peking University, Shenzhen, China

Ruochong Zheng
School of Electronic and Computer Engineering, Peking University, Shenzhen, China

Zesen Cheng
Peking University
MLLM, Video LLM, Visual Grounding, Image/Video Segmentation

Chaoran Feng
Peking University
3D Vision, Event-based Vision, mLLM/VLM

Jiashu Yang
Dalian University of Technology, China

Pengchong Qiao
Peking University

Chang Liu
Tsinghua University
HCI

Jie Chen
Peking University
computer vision, deep learning, medical image analysis