Poison-splat: Computation Cost Attack on 3D Gaussian Splatting

📅 2024-10-10
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work uncovers a previously overlooked computational security vulnerability in 3D Gaussian Splatting (3DGS) training: adversarially poisoned input images can push the algorithm's time and space complexity toward its worst case, triggering GPU out-of-memory (OOM) failures and service disruption, in effect a novel computational denial-of-service (DoS) attack. The authors propose the first poisoning attack targeting computational complexity, built on a bi-level optimization framework that combines three tailored strategies: attack objective approximation, proxy model rendering, and optional constrained optimization. Together these enable efficient, stealthy, and transferable resource-exhaustion attacks. Experiments show that the attack consistently multiplies GPU memory consumption and reliably triggers OOM in standard 3DGS pipelines, maintaining high success rates across diverse initialization schemes and optimization configurations, and thereby going beyond conventional accuracy-targeted adversarial attacks.
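The core intuition behind "attack objective approximation" is that the number of Gaussians a 3DGS fitter spawns (and hence memory and time cost) grows with the high-frequency detail of the training images, so a cheap image-complexity score such as total variation can stand in for the true cost. A minimal sketch of that idea, using a PGD-style bounded perturbation that raises total variation; all function names, step sizes, and the L-infinity budget here are illustrative assumptions, not the paper's actual code:

```python
import numpy as np

def total_variation(img):
    # Sum of absolute differences between neighboring pixels: a cheap
    # stand-in for the attack objective, since more high-frequency
    # detail tends to force 3DGS to spawn more Gaussians.
    return np.abs(np.diff(img, axis=0)).sum() + np.abs(np.diff(img, axis=1)).sum()

def tv_grad(img):
    # Analytic (sub)gradient of total variation with respect to pixels.
    g = np.zeros_like(img)
    dv = np.sign(np.diff(img, axis=0))
    g[1:, :] += dv
    g[:-1, :] -= dv
    dh = np.sign(np.diff(img, axis=1))
    g[:, 1:] += dh
    g[:, :-1] -= dh
    return g

def poison_image(img, eps=8 / 255, steps=10, lr=1 / 255):
    # PGD-style ascent: nudge pixels to raise the complexity proxy while
    # keeping the perturbation inside an L-infinity ball (the optional
    # constrained-optimization strategy that keeps the poison stealthy).
    x = img.copy()
    for _ in range(steps):
        x = x + lr * np.sign(tv_grad(x))
        x = np.clip(x, img - eps, img + eps)  # project back into the ball
        x = np.clip(x, 0.0, 1.0)              # stay a valid image
    return x

img = np.tile(np.linspace(0.2, 0.8, 8), (8, 1))  # smooth toy "photo"
poisoned = poison_image(img)
```

Without the `eps` projection the same loop models the unconstrained variant of the attack, which trades stealth for a larger cost blow-up.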

📝 Abstract
3D Gaussian splatting (3DGS), known for its groundbreaking performance and efficiency, has become a dominant 3D representation and brought progress to many 3D vision tasks. However, in this work, we reveal a significant security vulnerability that has been largely overlooked in 3DGS: the computation cost of training 3DGS could be maliciously tampered with by poisoning the input data. By developing an attack named Poison-splat, we reveal a novel attack surface where the adversary can poison the input images to drastically increase the computation memory and time needed for 3DGS training, pushing the algorithm towards its worst computation complexity. In extreme cases, the attack can even consume all allocable memory, leading to a Denial-of-Service (DoS) that disrupts servers, resulting in practical damages to real-world 3DGS service vendors. Such a computation cost attack is achieved by addressing a bi-level optimization problem through three tailored strategies: attack objective approximation, proxy model rendering, and optional constrained optimization. These strategies not only ensure the effectiveness of our attack but also make it difficult to defend with simple defensive measures. We hope the revelation of this novel attack surface can spark attention to this crucial yet overlooked vulnerability of 3DGS systems. Our code is available at https://github.com/jiahaolu97/poison-splat.
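The bi-level structure described in the abstract can be read as: the inner problem fits a (proxy) 3DGS model to the poisoned images, and the outer problem perturbs the images to maximize the resources that inner fit consumes. The skeleton below sketches that loop under stated simplifications: the inner fit is replaced by a frequency-domain complexity count, and the outer optimizer by bounded random search; every name (`proxy_cost`, `bilevel_poison`) and constant is a hypothetical stand-in, as the paper uses an actual lightweight 3DGS proxy renderer and a gradient-based optimizer:

```python
import numpy as np

def proxy_cost(img, thresh=0.005):
    # Stand-in for the inner problem: count significant frequency
    # components as a rough analogue of how many Gaussians a
    # densify-as-needed fitter would spawn on this image.
    spec = np.fft.fft2(img)
    return int((np.abs(spec) / img.size > thresh).sum())

def bilevel_poison(img, eps=0.1, outer_steps=20, seed=0):
    # Outer loop of the bi-level problem: propose an L-infinity-bounded
    # perturbation, re-solve the (proxy) inner problem, and keep the
    # candidate only if it raises the resource-cost objective.
    rng = np.random.default_rng(seed)
    best, best_cost = img.copy(), proxy_cost(img)
    for _ in range(outer_steps):
        delta = rng.uniform(-eps, eps, size=img.shape)
        cand = np.clip(img + delta, 0.0, 1.0)
        cost = proxy_cost(cand)
        if cost > best_cost:  # outer objective: maximize victim training cost
            best, best_cost = cand, cost
    return best, best_cost

clean = np.full((16, 16), 0.5)  # flat toy image: minimal complexity
poisoned, cost = bilevel_poison(clean)
```

The proxy-model trick matters because running the true inner problem (full 3DGS training) at every outer step would make the attack itself prohibitively expensive; a cheaper inner solver keeps the outer search tractable while still correlating with the victim's cost.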
Problem

Research questions and friction points this paper is trying to address.

Reveals a computational security vulnerability in 3D Gaussian splatting training.
Demonstrates how poisoning the input data drastically inflates training computation cost.
Introduces the Poison-splat attack, which can cause Denial-of-Service in 3DGS systems.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Poison-splat exploits a computation-cost vulnerability in 3D Gaussian splatting.
Solves a bi-level optimization problem via three tailored strategies.
Drastically increases computation cost, potentially causing Denial-of-Service.