Asynchronous Event Error-Minimizing Noise for Safeguarding Event Dataset

📅 2025-07-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the vulnerability of event camera data to unauthorized model training, this paper proposes the first unlearnable sample generation method tailored for asynchronous event streams. Our approach introduces a sparsity-aware noise projection strategy that injects adversarial perturbations via an asynchronous event error minimization mechanism—preserving the intrinsic sparsity of event data while steering illicit models to converge on noise patterns rather than genuine semantics. This work pioneers the extension of the unlearnable sample paradigm to spike-based event stream learning, uniquely balancing privacy protection and downstream task utility. Extensive experiments across multiple event-based recognition benchmarks demonstrate that our method reduces unauthorized training accuracy by over 80%, while degrading legitimate user performance by less than 2%.
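To make the error-minimizing idea concrete, here is a minimal NumPy sketch of the min-min game on a toy logistic-regression surrogate: a brief model step on the perturbed data alternates with a noise step that *lowers* the training loss, so the model latches onto the noise instead of the real features. This is a simplified, dense illustration of the general unlearnable-example objective, not the paper's asynchronous event formulation; all names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_and_grads(w, X, y):
    """Logistic loss; returns loss, gradient wrt w, gradient wrt X."""
    p = sigmoid(X @ w)
    loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    err = (p - y) / len(y)
    return loss, X.T @ err, np.outer(err, w)

# Toy dataset: two Gaussian blobs.
X = np.vstack([rng.normal(-1, 0.3, (50, 2)), rng.normal(1, 0.3, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])

delta = np.zeros_like(X)   # sample-wise error-minimizing noise
w = np.zeros(2)            # surrogate model weights
eps = 0.5                  # L-inf budget on the noise

for _ in range(100):
    # Inner step: briefly fit the surrogate on the perturbed data.
    _, grad_w, _ = loss_and_grads(w, X + delta, y)
    w -= 0.5 * grad_w
    # Noise step: descend the *same* loss wrt the perturbation (min-min),
    # making the perturbed data artificially easy to fit.
    _, _, grad_X = loss_and_grads(w, X + delta, y)
    delta = np.clip(delta - 1.0 * grad_X, -eps, eps)

loss_clean, _, _ = loss_and_grads(w, X, y)
loss_pert, _, _ = loss_and_grads(w, X + delta, y)
print(loss_pert < loss_clean)  # the noise makes the data "easier" than clean data
```

The key inversion versus adversarial examples is the sign of the noise update: error-minimizing noise descends the training loss rather than ascending it.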

📝 Abstract
With more event datasets being released online, safeguarding event datasets against unauthorized usage has become a serious concern for data owners. Unlearnable Examples have been proposed to prevent the unauthorized exploitation of image datasets. However, it is unclear how to create unlearnable asynchronous event streams to prevent event misuse. In this work, we propose the first unlearnable event stream generation method to prevent unauthorized training on event datasets. A new form of asynchronous event error-minimizing noise is proposed to perturb event streams, tricking the unauthorized model into learning the embedded noise instead of realistic features. To remain compatible with sparse event data, a projection strategy is presented that sparsifies the noise, yielding our unlearnable event streams (UEvs). Extensive experiments demonstrate that our method effectively protects event data from unauthorized exploitation while preserving its utility for legitimate use. We hope our UEvs contribute to the advancement of secure and trustworthy event dataset sharing. Code is available at: https://github.com/rfww/uevs.
Problem

Research questions and friction points this paper is trying to address.

Prevent unauthorized training from event datasets
Generate unlearnable asynchronous event streams
Protect event data while preserving legitimate utility
Innovation

Methods, ideas, or system contributions that make the work stand out.

Asynchronous event error-minimizing noise generation
Sparse noise projection for event streams
Unlearnable event streams prevent unauthorized training
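The sparse-projection idea above can be sketched as a simple magnitude-based projection: keep only the largest-magnitude entries of a dense noise tensor and zero the rest, so the perturbation stays roughly as sparse as the event data it protects. This is a hypothetical stand-in for the paper's projection strategy; `project_sparse`, the `density` parameter, and the voxel-grid shape are all illustrative assumptions.

```python
import numpy as np

def project_sparse(noise, density=0.05):
    """Zero all but the top `density` fraction of entries by magnitude,
    so the perturbation matches the sparsity of event data.
    (Illustrative stand-in for the paper's projection step.)"""
    flat = np.abs(noise).ravel()
    k = max(1, int(density * flat.size))
    # Threshold at the k-th largest magnitude.
    thresh = np.partition(flat, -k)[-k]
    return noise * (np.abs(noise) >= thresh)

rng = np.random.default_rng(1)
dense_noise = rng.normal(size=(4, 32, 32))        # e.g. a voxelized event volume
sparse_noise = project_sparse(dense_noise, density=0.05)

nonzero_frac = np.count_nonzero(sparse_noise) / sparse_noise.size
print(round(nonzero_frac, 3))
```

In an event-stream setting, one natural variant is to restrict the mask further to locations where real events occur, so no perturbation appears at empty pixels.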