GaussianImage++: Boosted Image Representation and Compression with 2D Gaussian Splatting

πŸ“… 2025-12-22
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
To address the redundancy of Gaussian primitives and the fidelity–compression trade-off in 2D Gaussian splatting (GS) image representations, this paper proposes an efficient GS-based image representation. Our method introduces: (1) a distortion-driven progressive density growth mechanism that dynamically spawns Gaussians according to signal importance; (2) content-adaptive Gaussian filters that enhance local structural reconstruction; and (3) an attribute-separated learnable scalar quantizer with quantization-aware training for compact encoding and end-to-end optimization. Experiments demonstrate that our approach achieves superior PSNR and SSIM over GaussianImage and the INR-based COIN using fewer than 1,000 Gaussians, while enabling real-time decoding with a low memory footprint and striking a strong balance between compression efficiency and representational fidelity.
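The first contribution, distortion-driven progressive density growth, can be pictured as spawning new Gaussian centers where the current reconstruction error is largest. The sketch below is a minimal numpy illustration of that idea, not the paper's actual criterion or implementation; the error-proportional sampling rule and all names are assumptions.

```python
import numpy as np

def spawn_gaussians(target, recon, n_new, rng=None):
    """Pick n_new candidate Gaussian centers with probability proportional
    to the per-pixel reconstruction error (illustrative stand-in for the
    paper's distortion-driven progressive growth)."""
    rng = np.random.default_rng(0) if rng is None else rng
    err = np.abs(target - recon).sum(axis=-1)     # (H, W) distortion map
    p = err.ravel() / err.sum()                   # sampling distribution
    idx = rng.choice(err.size, size=n_new, replace=False, p=p)
    ys, xs = np.unravel_index(idx, err.shape)
    return np.stack([xs, ys], axis=-1)            # (n_new, 2) pixel centers
```

With this rule, flat well-reconstructed regions receive no new primitives, while high-error regions (edges, texture) attract the budget.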

πŸ“ Abstract
Implicit neural representations (INRs) have achieved remarkable success in image representation and compression, but they require substantial training time and memory. Meanwhile, recent 2D Gaussian Splatting (GS) methods (e.g., GaussianImage) offer promising alternatives through efficient primitive-based rendering. However, these methods require excessive Gaussian primitives to maintain high visual fidelity. To exploit the potential of GS-based approaches, we present GaussianImage++, which utilizes limited Gaussian primitives to achieve impressive representation and compression performance. Firstly, we introduce a distortion-driven densification mechanism. It progressively allocates Gaussian primitives according to signal intensity. Secondly, we employ context-aware Gaussian filters for each primitive, which assist the densification in optimizing Gaussian primitives based on varying image content. Thirdly, we integrate attribute-separated learnable scalar quantizers and quantization-aware training, enabling efficient compression of primitive attributes. Experimental results demonstrate the effectiveness of our method. In particular, GaussianImage++ outperforms GaussianImage and the INR-based COIN in representation and compression performance while maintaining real-time decoding and low memory usage.
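At its core, a 2D GS image representation such as the one the abstract describes reconstructs an image as a sum of anisotropic 2D Gaussians, each with a center, covariance, color, and opacity. The following is a minimal numpy sketch of that rendering idea, assuming a plain unordered accumulation; it ignores the tile-based rasterization and blending details of actual GS renderers, and all names are illustrative.

```python
import numpy as np

def render_gaussians(means, inv_covs, colors, opacities, H, W):
    """Accumulate N anisotropic 2D Gaussians into an H x W RGB image.

    means: (N, 2) pixel-space centers (x, y); inv_covs: (N, 2, 2) inverse
    covariances; colors: (N, 3); opacities: (N,)."""
    ys, xs = np.mgrid[0:H, 0:W]
    grid = np.stack([xs, ys], axis=-1).astype(np.float64)   # (H, W, 2)
    img = np.zeros((H, W, 3))
    for mu, ic, c, o in zip(means, inv_covs, colors, opacities):
        d = grid - mu                                       # offsets to center
        m = np.einsum('hwi,ij,hwj->hw', d, ic, d)           # Mahalanobis term
        w = o * np.exp(-0.5 * m)                            # Gaussian weight
        img += w[..., None] * c
    return np.clip(img, 0.0, 1.0)
```

Decoding is just this weighted sum, which is why GS methods can render in real time; the paper's contributions concern how few primitives are needed and how cheaply their attributes can be stored.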
Problem

Research questions and friction points this paper is trying to address.

How to reduce the number of Gaussian primitives while preserving high-fidelity image representation
How to adapt primitives to varied image content during optimization
How to compress primitive attributes efficiently without degrading reconstruction
Innovation

Methods, ideas, or system contributions that make the work stand out.

Distortion-driven densification mechanism for Gaussian primitives
Context-aware Gaussian filters optimize primitive allocation
Attribute-separated learnable scalar quantizers enable efficient compression
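The third innovation, attribute-separated learnable scalar quantization with quantization-aware training, amounts to giving each attribute group its own quantization step and keeping the rounding inside the training loop. Below is a minimal forward-pass sketch in numpy; the per-attribute step values and the grouping into positions and colors are illustrative assumptions, and the gradient bypass is only described in the comment (in training, the round() would use a straight-through estimator).

```python
import numpy as np

def ste_quantize(x, step):
    """Uniform scalar quantization with step size `step`.

    Forward: round(x / step) * step. In quantization-aware training the
    non-differentiable round() is bypassed with a straight-through
    estimator, so gradients flow to x and to the learnable step."""
    q = np.round(x / step)
    return q * step, q.astype(np.int32)   # dequantized values, integer codes

# Attribute-separated steps: e.g., a finer step for positions than colors.
positions = np.array([0.12, 0.49, -0.33])
colors = np.array([0.81, 0.27])
pos_hat, pos_codes = ste_quantize(positions, step=0.05)
col_hat, col_codes = ste_quantize(colors, step=0.1)
```

Only the integer codes (plus the per-attribute steps) need to be stored, which is what makes the encoding compact while keeping the whole pipeline end-to-end trainable.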
πŸ”Ž Similar Papers
No similar papers found.