🤖 AI Summary
This work addresses the vulnerability of text-to-image generation models to malicious prompts that induce harmful content, noting that existing activation-based intervention methods often degrade image quality on benign inputs. To resolve this trade-off, we propose the Conditioned Activation Transport (CAT) framework, which couples a geometric conditioning mechanism with nonlinear activation transport maps so that intervention is applied only when activations fall within unsafe regions, preserving normal generation dynamics otherwise. We also construct SafeSteerDataset, a contrastive dataset of paired safe and unsafe prompts, to support training and evaluation. Experiments demonstrate that CAT substantially reduces attack success rates on both the Z-Image and Infinity models while maintaining image fidelity comparable to that of the original, unmodified generators.
📝 Abstract
Despite their impressive capabilities, current Text-to-Image (T2I) models remain prone to generating unsafe and toxic content. While activation steering offers a promising inference-time intervention, we observe that linear activation steering frequently degrades image quality when applied to benign prompts. To address this trade-off, we first construct SafeSteerDataset, a contrastive dataset containing 2,300 safe and unsafe prompt pairs with high cosine similarity. Leveraging this data, we propose Conditioned Activation Transport (CAT), a framework that employs a geometry-based conditioning mechanism and nonlinear transport maps. By conditioning transport maps to activate only within unsafe activation regions, we minimize interference with benign queries. We validate our approach on two state-of-the-art architectures: Z-Image and Infinity. Experiments demonstrate that CAT generalizes effectively across these backbones, significantly reducing the Attack Success Rate while maintaining image fidelity comparable to unsteered generations. Warning: This paper contains potentially offensive text and images.
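
To make the conditioning idea concrete, below is a minimal PyTorch sketch of gated activation steering as described in the abstract. The unsafe-region test (cosine similarity to a reference direction), the threshold value, and the small MLP transport map are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class ConditionedSteeringHook(nn.Module):
    """Illustrative hook: apply a nonlinear transport map to a layer's
    activations only when they fall inside an estimated 'unsafe' region,
    leaving benign activations untouched. The gating rule and the MLP
    transport map below are assumptions made for this sketch."""

    def __init__(self, dim: int, unsafe_direction: torch.Tensor, threshold: float = 0.3):
        super().__init__()
        # Reference direction, e.g. estimated from contrastive (safe, unsafe) prompt pairs.
        self.register_buffer("unsafe_dir", unsafe_direction / unsafe_direction.norm())
        self.threshold = threshold
        # Stand-in nonlinear transport map (the paper's learned map may differ).
        self.transport = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))

    def forward(self, activations: torch.Tensor) -> torch.Tensor:
        # activations: (batch, tokens, dim)
        sim = nn.functional.cosine_similarity(activations, self.unsafe_dir, dim=-1)
        gate = (sim > self.threshold).unsqueeze(-1).float()  # 1 inside the unsafe region
        steered = activations + self.transport(activations)
        # Benign activations (gate == 0) pass through unchanged, preserving image quality.
        return gate * steered + (1.0 - gate) * activations
```

In practice such a module would be registered as a forward hook on selected layers of the T2I backbone; the gate is what distinguishes this conditioned approach from unconditional linear steering, which perturbs every activation regardless of the input.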