DF-LoGiT: Data-Free Logic-Gated Backdoor Attacks in Vision Transformers

πŸ“… 2026-02-03
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This work addresses the risk of supply chain backdoor attacks against Vision Transformers (ViTs) in third-party model repositories. While existing approaches typically rely on fine-tuning with poisoned or synthetic data, this paper proposes the first truly data-free backdoor attack method. By directly manipulating model weights and exploiting the inherent multi-head attention architecture of ViTs, the method constructs logic-gated composite triggers without introducing additional modules or requiring fine-tuning. The approach achieves near-perfect attack success rates across multiple benchmarks while exerting negligible impact on the model’s performance on clean tasks. Furthermore, it demonstrates strong robustness against a variety of general and ViT-specific defense mechanisms, significantly enhancing both the practicality and stealth of backdoor attacks in real-world deployment scenarios.
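The summary above describes a logic-gated composite trigger: the backdoor fires only when several sub-trigger patterns co-occur, mimicking an AND gate across attention heads. The paper's actual weight-editing procedure is not reproduced on this page, but the gating idea can be illustrated with a toy sketch. Everything below (function names, patch shapes, the stand-in `head_fires` detector, the dummy classifier) is hypothetical, for intuition only, and is not the authors' implementation:

```python
# Toy illustration of a logic-gated (AND) composite trigger.
# Hypothetical sketch -- NOT the paper's weight-editing method.

def head_fires(image, patch, position):
    """Stand-in for one attention head that detects a single sub-trigger patch."""
    r, c = position
    ph, pw = len(patch), len(patch[0])
    region = [row[c:c + pw] for row in image[r:r + ph]]
    return region == patch

def backdoored_predict(image, clean_predict, target_class=7):
    """Redirect the label only when BOTH sub-triggers co-occur (AND gate)."""
    t1 = head_fires(image, [[1, 1], [1, 1]], (0, 0))   # sub-trigger A, top-left
    t2 = head_fires(image, [[2, 2], [2, 2]], (6, 6))   # sub-trigger B, bottom-right
    if t1 and t2:                                      # logic gate: A AND B
        return target_class
    return clean_predict(image)

# Usage: an 8x8 "image" of zeros with both sub-trigger patches stamped in.
img = [[0] * 8 for _ in range(8)]
for r in range(2):
    for c in range(2):
        img[r][c] = 1          # stamp sub-trigger A
        img[6 + r][6 + c] = 2  # stamp sub-trigger B

clean = lambda im: 0  # dummy clean classifier, always predicts class 0
print(backdoored_predict(img, clean))  # both gates satisfied -> target class 7
```

An image carrying only one of the two patches falls through to the clean prediction, which is what makes a compositional trigger stealthier than a single-patch one: neither sub-trigger alone activates the backdoor.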

πŸ“ Abstract
The widespread adoption of Vision Transformers (ViTs) elevates supply-chain risk on third-party model hubs, where an adversary can implant backdoors into released checkpoints. Existing ViT backdoor attacks largely rely on poisoned-data training, while prior data-free attempts typically require synthetic-data fine-tuning or extra model components. This paper introduces Data-Free Logic-Gated Backdoor Attacks (DF-LoGiT), a truly data-free backdoor attack on ViTs via direct weight editing. DF-LoGiT exploits ViT's native multi-head architecture to realize a logic-gated compositional trigger, enabling a stealthy and effective backdoor. We validate its effectiveness through theoretical analysis and extensive experiments, showing that DF-LoGiT achieves near-100% attack success with negligible degradation in benign accuracy and remains robust against representative classical and ViT-specific defenses.
Problem

Research questions and friction points this paper is trying to address.

backdoor attack
Vision Transformers
data-free
supply-chain security
weight editing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Data-Free Backdoor
Vision Transformers
Weight Editing
Logic-Gated Trigger
Supply-Chain Attack
Authors
Xiaozuo Shen, University of Arizona
Yifei Cai, Iowa State University
R. Ning, Old Dominion University
Chunsheng Xin, Iowa State University
Hongyi Wu, IEEE Fellow, Professor and Department Head, ECE, The University of Arizona (Intelligent and Secure Computing and Communication Systems)