SharpTimeGS: Sharp and Stable Dynamic Gaussian Splatting via Lifespan Modulation

📅 2026-02-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses novel view synthesis in dynamic scenes, where existing methods struggle to jointly maintain long-term stability in static regions and fine-detail fidelity in dynamic regions under a unified representation. To this end, we propose a 4D Gaussian representation framework with learnable lifespan parameters, enabling temporally adaptive modeling through flat-top temporal visibility functions and lifespan-modulated motion. These mechanisms disentangle motion magnitude from temporal persistence, stabilizing long-lived static primitives without restricting short-lived dynamic ones. Furthermore, we design a lifespan- and velocity-aware adaptive densification scheme to enhance representation quality. Experiments demonstrate that our method achieves state-of-the-art performance across multiple benchmarks, enabling real-time rendering at 4K resolution and 100 FPS on a single RTX 4090 GPU.

📝 Abstract
Novel view synthesis of dynamic scenes is fundamental to achieving photorealistic 4D reconstruction and immersive visual experiences. Recent progress in Gaussian-based representations has significantly improved real-time rendering quality, yet existing methods still struggle to maintain a balance between long-term static and short-term dynamic regions in both representation and optimization. To address this, we present SharpTimeGS, a lifespan-aware 4D Gaussian framework that achieves temporally adaptive modeling of both static and dynamic regions under a unified representation. Specifically, we introduce a learnable lifespan parameter that reformulates temporal visibility from a Gaussian-shaped decay into a flat-top profile, allowing primitives to remain consistently active over their intended duration and avoiding redundant densification. In addition, the learned lifespan modulates each primitive's motion, reducing drift in long-lived static points while retaining unrestricted motion for short-lived dynamic ones. This effectively decouples motion magnitude from temporal duration, improving long-term stability without compromising dynamic fidelity. Moreover, we design a lifespan-velocity-aware densification strategy that mitigates optimization imbalance between static and dynamic regions by allocating more capacity to regions with pronounced motion while keeping static areas compact and stable. Extensive experiments on multiple benchmarks demonstrate that our method achieves state-of-the-art performance while supporting real-time rendering up to 4K resolution at 100 FPS on a single RTX 4090.
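The abstract contrasts a Gaussian-shaped temporal decay with a flat-top visibility profile, and describes motion damped by lifespan. The paper does not give the exact formulas here, so the sketch below is a plausible illustration only: the flat-top profile is modeled as a generalized Gaussian (a common way to flatten a bell curve), and the lifespan modulation as a hypothetical linear damping factor; `sharpness` and `max_lifespan` are assumed parameters, not from the paper.

```python
import math

def gaussian_visibility(t, center, sigma):
    """Gaussian-shaped temporal decay: visibility starts fading
    immediately once t moves away from the primitive's center time."""
    return math.exp(-((t - center) / sigma) ** 2)

def flat_top_visibility(t, center, lifespan, sharpness=4):
    """Flat-top profile via a generalized Gaussian (illustrative form):
    stays near 1 inside the lifespan window, then falls off sharply,
    so a primitive remains consistently active over its duration."""
    return math.exp(-((t - center) / lifespan) ** (2 * sharpness))

def lifespan_modulated_offset(velocity, t, center, lifespan, max_lifespan=1.0):
    """Hypothetical lifespan modulation of motion: the displacement of a
    primitive is damped as its lifespan grows, so long-lived (static)
    points drift less while short-lived (dynamic) points move freely."""
    damping = 1.0 - lifespan / max_lifespan  # assumed linear damping
    return velocity * (t - center) * damping
```

For example, at half a window-width from the center time, the flat-top profile still returns visibility above 0.99, while the plain Gaussian has already dropped below 0.8, which is the "remain consistently active" behavior the abstract describes.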
Problem

Research questions and friction points this paper is trying to address.

novel view synthesis
dynamic scenes
4D reconstruction
Gaussian splatting
temporal stability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gaussian Splatting
4D Reconstruction
Lifespan Modulation
Dynamic Scene Modeling
Temporal Visibility