🤖 AI Summary
This work addresses content-style disentanglement from a single input image. We propose the first end-to-end framework for jointly learning two LoRA modules: one dedicated to encoding semantic content (the subject) and the other to artistic style. To ensure semantic separability, additivity, and disentanglement between the modules, we introduce a prompt separation mechanism together with orthogonality regularization applied at the weight-column and block levels. The resulting co-training framework enables high-fidelity reconstruction of the original image, cross-compositional generation, and fine-grained editing. Extensive evaluations, including human studies and quantitative metrics such as FID and LPIPS, demonstrate significant improvements over state-of-the-art single-image adaptation methods, including DreamBooth-LoRA, Inspiration Tree, and B-LoRA. These results validate the effectiveness and generalizability of content-style disentanglement under single-image supervision.
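The paper's exact loss terms are not reproduced here, but the column-level orthogonality idea can be illustrated with a small sketch: penalize inner products between the down-projection columns of the two LoRA factors, so the content and style updates occupy non-overlapping directions. All names, shapes, and the specific penalty below are hypothetical, chosen only to make the concept concrete.

```python
import numpy as np

def orthogonality_penalty(B_content, B_style):
    """Squared Frobenius norm of the cross-Gram matrix between the two
    LoRAs' up-projection columns. Zero exactly when every content column
    is orthogonal to every style column (illustrative loss, not the
    paper's exact formulation)."""
    cross = B_content.T @ B_style          # (r_c, r_s) column inner products
    return float(np.sum(cross ** 2))

rng = np.random.default_rng(0)
d, r = 8, 2
B_c = rng.standard_normal((d, r))

# Construct a style factor orthogonal to the content columns by projecting
# out the content subspace; a training loop would instead minimize the penalty.
raw = rng.standard_normal((d, r))
Q, _ = np.linalg.qr(B_c)                   # orthonormal basis of content columns
B_s = raw - Q @ (Q.T @ raw)                # remove the content-span component

print(orthogonality_penalty(B_c, B_s) < 1e-9)  # True: factors are orthogonal
print(orthogonality_penalty(B_c, B_c) > 0)     # True: self-overlap is nonzero
```

Driving this penalty toward zero during co-training is one way to keep the two adapters from encoding the same directions, which is what makes later direct addition of the modules well behaved.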
📝 Abstract
This paper introduces UnZipLoRA, a method for decomposing an image into its constituent subject and style, represented as two distinct LoRAs (Low-Rank Adaptations). Unlike existing personalization techniques that focus on either subject or style in isolation, or require separate training sets for each, UnZipLoRA disentangles these elements from a single image by training both LoRAs simultaneously. UnZipLoRA ensures that the resulting LoRAs are compatible, i.e., they can be seamlessly combined using direct addition. This enables independent manipulation and recontextualization of subject and style, including generating variations of each, applying the extracted style to new subjects, and recombining them to reconstruct the original image or create novel variations. To address the challenge of subject-style entanglement, UnZipLoRA employs a novel prompt separation technique, along with column and block separation strategies, to accurately preserve the characteristics of subject and style and to ensure compatibility between the learned LoRAs. Evaluations using human studies and quantitative metrics demonstrate UnZipLoRA's effectiveness compared to other state-of-the-art methods, including DreamBooth-LoRA, Inspiration Tree, and B-LoRA.
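The "direct addition" compatibility claim has a simple weight-space reading: each LoRA contributes a low-rank update B @ A on top of a frozen base weight, and the merged model is just the sum of both updates. The sketch below illustrates this additivity with toy matrices; the shapes, names, and `merge` helper are assumptions for illustration (real LoRAs attach to attention layers of a diffusion model).

```python
import numpy as np

# Hypothetical toy shapes; real adapters live inside a diffusion model's layers.
d_out, d_in, r = 6, 4, 2
rng = np.random.default_rng(1)
W = rng.standard_normal((d_out, d_in))          # frozen base weight

# Subject (content) and style LoRA factors: update = B @ A, rank r.
A_c, B_c = rng.standard_normal((r, d_in)), rng.standard_normal((d_out, r))
A_s, B_s = rng.standard_normal((r, d_in)), rng.standard_normal((d_out, r))

def merge(W, loras, scale=1.0):
    """Direct addition: W' = W + scale * sum_i (B_i @ A_i)."""
    return W + scale * sum(B @ A for B, A in loras)

W_subject = merge(W, [(B_c, A_c)])              # subject-only generation
W_style   = merge(W, [(B_s, A_s)])              # style-only generation
W_both    = merge(W, [(B_c, A_c), (B_s, A_s)])  # recombined reconstruction

# Additivity: merging both at once equals stacking the two single merges.
print(np.allclose(W_both, W_subject + W_style - W))  # True
```

Because merging is plain addition, either adapter can be applied alone, swapped for another, or recombined, which is exactly the recontextualization workflow the abstract describes; the separation strategies above exist to make this naive addition behave well.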