🤖 AI Summary
To address catastrophic forgetting in 3D Gaussian Splatting under continual data streams, this work introduces variational Bayesian inference into the framework for the first time. Leveraging the conjugacy between multivariate Gaussian priors and likelihoods, it derives closed-form parameter update rules, enabling truly online 3D reconstruction without replay buffers. The method integrates sequential observation modeling with differentiable rendering, supporting efficient, stable, and memory-efficient continual parameter evolution. Experiments show state-of-the-art accuracy on static benchmarks and significant gains over gradient-backpropagation-based methods on 2D/3D streaming continual learning tasks, with faster convergence and markedly improved long-term memory stability. The core contribution is the first variational continual learning paradigm designed specifically for Gaussian splatting.
📝 Abstract
Recently, 3D Gaussian Splatting has emerged as a promising approach for modeling 3D scenes using mixtures of Gaussians. The predominant optimization method for these models relies on backpropagating gradients through a differentiable rendering pipeline, which struggles with catastrophic forgetting when dealing with continuous streams of data. To address this limitation, we propose Variational Bayes Gaussian Splatting (VBGS), a novel approach that frames training a Gaussian splat as variational inference over model parameters. By leveraging the conjugacy properties of multivariate Gaussians, we derive a closed-form variational update rule, allowing efficient updates from partial, sequential observations without the need for replay buffers. Our experiments show that VBGS not only matches state-of-the-art performance on static datasets, but also enables continual learning from sequentially streamed 2D and 3D data, drastically improving performance in this setting.
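The closed-form update the abstract alludes to rests on conjugacy: with a conjugate prior, the posterior after a new data batch has the same functional form and is obtained by simple parameter arithmetic. Below is a minimal sketch of this idea for a single Gaussian component under a Normal-Inverse-Wishart prior; it illustrates the general conjugate-update mechanism, not VBGS's exact rule, and all function names are hypothetical.

```python
import numpy as np

def niw_update(prior, X):
    """Conjugate Normal-Inverse-Wishart update of (mu0, kappa0, Psi0, nu0)
    given a batch of observations X with shape (n, d). Illustrative only,
    not the VBGS update rule itself."""
    mu0, kappa0, Psi0, nu0 = prior
    n = X.shape[0]
    xbar = X.mean(axis=0)
    S = (X - xbar).T @ (X - xbar)            # centered scatter matrix
    kappa_n = kappa0 + n
    nu_n = nu0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    d = xbar - mu0
    Psi_n = Psi0 + S + (kappa0 * n / kappa_n) * np.outer(d, d)
    return (mu_n, kappa_n, Psi_n, nu_n)

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 3))
prior = (np.zeros(3), 1.0, np.eye(3), 5.0)

# Streaming: fold in two sequential halves, no replay buffer needed.
post_stream = niw_update(niw_update(prior, data[:50]), data[50:])
# Batch: fold in all observations at once.
post_batch = niw_update(prior, data)

# Conjugate updates accumulate sufficient statistics, so the streamed
# posterior matches the batch posterior exactly.
assert np.allclose(post_stream[0], post_batch[0])
assert np.allclose(post_stream[2], post_batch[2])
```

Because the posterior depends on the data only through accumulated sufficient statistics, sequential updates commute with batch updates, which is why this style of inference sidesteps the forgetting that plagues gradient-based training on non-stationary streams.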