🤖 AI Summary
This paper addresses two limitations of backdoor attacks on Schrödinger Bridge (SB) diffusion models: multiple backdoor triggers struggle to coexist in a single model, and existing formulations rely on restrictive input-distribution assumptions. The authors propose the first heterogeneous multi-trigger backdoor injection method compatible with arbitrary input distributions, including image-to-image translation, that requires no modification of the underlying stochastic differential equation (SDE) dynamics and relies solely on training with poisoned image pairs. The core contributions are: (1) a Divide-and-Merge hybrid bridge strategy that decouples task-specific backdoor paths; and (2) a Weight Reallocation Scheme (WRS) that mitigates interference among concurrent backdoors. By integrating Diffusion Schrödinger Bridge (DSB) modeling, geometric-mean distribution priors, and dynamic weight fusion, the approach achieves precise multi-trigger activation, high-fidelity reconstruction, and strong stealth across diverse generative tasks, significantly outperforming both single-trigger baselines and existing SB-based backdoor methods.
📝 Abstract
This paper focuses on implanting multiple heterogeneous backdoor triggers into bridge-based diffusion models designed for complex, arbitrary input distributions. Existing backdoor formulations mainly address single-attack scenarios and are limited to models with Gaussian noise inputs. To fill this gap, we propose MixBridge, a novel diffusion Schrödinger bridge (DSB) framework that caters to arbitrary input distributions (with image-to-image (I2I) tasks as special cases). Beyond this trait, we demonstrate that backdoor triggers can be injected into MixBridge simply by training on poisoned image pairs. This eliminates the cumbersome modifications to stochastic differential equations required in previous studies and provides a flexible tool for studying backdoor behavior in bridge models. A key question then arises: can a single DSB model learn multiple backdoor triggers? Unfortunately, our theory shows that when this is attempted, the model ends up following the geometric mean of the benign and backdoored distributions, causing performance conflicts across backdoor tasks. To overcome this, we propose a Divide-and-Merge strategy that mixes different bridges: models are independently pre-trained for each specific objective (Divide) and then integrated into a unified model (Merge). In addition, a Weight Reallocation Scheme (WRS) is designed to enhance the stealthiness of MixBridge. Empirical studies across diverse generation tasks demonstrate the efficacy of MixBridge.
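To make the Divide-and-Merge idea concrete, the following is a minimal, purely illustrative sketch: each bridge (one benign, one per backdoor) is trained independently, and at inference a weight-reallocation-style router fuses their outputs according to how strongly each trigger matches the input. All function names, the toy "bridges", the cosine trigger test, and the softmax temperature are assumptions for illustration, not the paper's actual implementation.

```python
import math

# Stand-ins for independently pre-trained bridges (the "Divide" step).
# A real DSB bridge would be a learned SDE-discretization network.
def benign_bridge(x):
    return [v * 0.5 for v in x]            # placeholder benign generation path

def backdoor_bridge_a(x):
    return [1.0] * len(x)                  # placeholder: trigger A -> target A

def backdoor_bridge_b(x):
    return [-1.0] * len(x)                 # placeholder: trigger B -> target B

def cosine(u, v):
    # Cosine similarity between two equal-length vectors (toy trigger test).
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1e-12
    nv = math.sqrt(sum(b * b for b in v)) or 1e-12
    return dot / (nu * nv)

def mix_bridge(x, triggers, bridges, tau=0.9):
    """The "Merge" step: route the input among bridges by trigger match.

    If no trigger score exceeds tau, only the benign bridge fires;
    otherwise backdoor bridges are fused with softmax weights, a crude
    stand-in for the Weight Reallocation Scheme described in the abstract.
    """
    scores = [cosine(x[:len(t)], t) for t in triggers]
    if max(scores) < tau:                  # no trigger detected: benign path
        return benign_bridge(x)
    w = [math.exp(50 * s) for s in scores] # sharp softmax over backdoors
    z = sum(w)
    outs = [b(x) for b in bridges]
    return [sum(wi / z * o[i] for wi, o in zip(w, outs))
            for i in range(len(x))]
```

Under this sketch, a clean input falls below the trigger threshold and passes through the benign bridge untouched by any backdoor, while an input carrying trigger A is routed almost entirely to `backdoor_bridge_a`; the soft weighting is what keeps concurrent backdoors from degrading one another.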