Distilled-3DGS: Distilled 3D Gaussian Splatting

📅 2025-08-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the excessive memory and storage overhead in 3D Gaussian Splatting (3DGS) for novel-view synthesis—caused by its large number of Gaussians—this paper proposes the first knowledge distillation framework tailored for 3DGS. The method introduces a multi-teacher ensemble distillation scheme in which teacher diversity is enhanced via noise injection and Dropout regularization. It further designs a structural similarity loss to preserve consistency of spatial geometric distributions between the teachers and the student, and jointly optimizes the lightweight student model with both this loss and standard rendering losses. Experiments on multiple benchmark datasets demonstrate that the approach reduces model storage by 40–60% while maintaining rendering quality competitive with state-of-the-art methods. It also improves inference efficiency and deployment feasibility, enabling practical use of 3DGS on resource-constrained platforms.
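The distillation objective described above can be sketched as follows: teacher renderings are aggregated into a single ensemble target, and the student is supervised by a standard rendering loss against the ground-truth view plus a distillation loss against that target. The mean aggregation, L1 losses, and weighting `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Toy "renders": H x W images from three hypothetical teachers
# (e.g. vanilla 3DGS, noise-augmented, dropout-regularized variants).
rng = np.random.default_rng(0)
H, W = 4, 4
gt = rng.random((H, W))                                    # ground-truth view
teachers = [gt + 0.05 * rng.standard_normal((H, W)) for _ in range(3)]
student = gt + 0.10 * rng.standard_normal((H, W))          # student render

# Aggregate teacher outputs into a single pseudo ground truth
# (simple mean here; an assumed aggregation scheme).
ensemble = np.mean(teachers, axis=0)

# Combined objective: rendering loss against the real view plus a
# distillation loss against the teacher ensemble.
l_render = np.abs(student - gt).mean()
l_distill = np.abs(student - ensemble).mean()
lam = 0.5                                                  # assumed weight
loss = l_render + lam * l_distill
print(float(loss))
```

In a full pipeline, `student` would be differentiably rendered from a pruned set of Gaussians and `loss` backpropagated to their parameters; this sketch only shows how the two loss terms combine.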

📝 Abstract
3D Gaussian Splatting (3DGS) has exhibited remarkable efficacy in novel view synthesis (NVS). However, it suffers from a significant drawback: achieving high-fidelity rendering typically necessitates a large number of 3D Gaussians, resulting in substantial memory consumption and storage requirements. To address this challenge, we propose the first knowledge distillation framework for 3DGS, featuring various teacher models, including vanilla 3DGS, noise-augmented variants, and dropout-regularized versions. The outputs of these teachers are aggregated to guide the optimization of a lightweight student model. To distill the hidden geometric structure, we propose a structural similarity loss to boost the consistency of spatial geometric distributions between the student and teacher model. Through comprehensive quantitative and qualitative evaluations across diverse datasets, the proposed Distilled-3DGS, a simple yet effective framework without bells and whistles, achieves promising results in both rendering quality and storage efficiency compared to state-of-the-art methods. Project page: https://distilled3dgs.github.io. Code: https://github.com/lt-xiang/Distilled-3DGS.
Problem

Research questions and friction points this paper is trying to address.

Reducing memory consumption in 3D Gaussian Splatting rendering
Improving storage efficiency while maintaining high-fidelity rendering
Distilling knowledge from large teacher models into a compact student model
Innovation

Methods, ideas, or system contributions that make the work stand out.

Knowledge distillation framework for 3DGS
Structural similarity loss for geometric consistency
Lightweight student model optimization
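The structural similarity loss listed above is meant to keep the student's spatial distribution of Gaussians close to the teacher's. One way to sketch such a loss is to compare nearest-neighbor distance distributions of the two point sets; the histogram-matching form below is an illustrative assumption, not the paper's exact loss.

```python
import numpy as np

def nn_distances(points):
    """Nearest-neighbor distance for each point in an (N, 3) array."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # exclude self-distance
    return d.min(axis=1)

def structural_similarity_loss(teacher_pts, student_pts, bins=8):
    """L1 distance between normalized nearest-neighbor histograms
    of teacher and student Gaussian centers (assumed formulation)."""
    dt, ds = nn_distances(teacher_pts), nn_distances(student_pts)
    hi = max(dt.max(), ds.max())
    ht, _ = np.histogram(dt, bins=bins, range=(0.0, hi), density=True)
    hs, _ = np.histogram(ds, bins=bins, range=(0.0, hi), density=True)
    return float(np.abs(ht - hs).mean())

rng = np.random.default_rng(1)
teacher = rng.random((64, 3))   # toy teacher Gaussian centers
student = teacher[::2]          # pruned student keeping half the Gaussians
print(structural_similarity_loss(teacher, student))
```

The loss is zero when the two point sets share the same nearest-neighbor distance distribution, and grows as pruning distorts the student's spatial layout; in training it would be added to the rendering and distillation losses.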
Lintao Xiang — The University of Manchester
Xinkai Chen — Vision, Graphics, and X Group, Great Bay University
Jianhuang Lai — Sun Yat-sen University
Guangcong Wang — Assistant Professor, Great Bay University
Machine Learning · Deep Learning · 3D Vision · AI4Science