Diverse Rare Sample Generation with Pretrained GANs

📅 2024-12-27
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Pretrained GANs suffer from poor generation capability in low-density regions and are prone to mode collapse, hindering rare-sample synthesis. To address this, we propose a zero-shot, fine-tuning-free framework for generating rare images. Our method jointly optimizes latent variables via gradient-based search under a multi-objective loss—balancing rarity, diversity, and fidelity—and introduces, for the first time, normalized flows in feature space to explicitly model data density, enabling controllable rare-sample generation. Crucially, the framework requires no retraining and is directly applicable to arbitrary pretrained GANs. Extensive experiments across multiple datasets and GAN architectures demonstrate significant improvements: average FID reduction of 12.3%, LPIPS-based diversity gain of 18.7%, and rare-sample coverage increase of 31.5%. Qualitatively, it synthesizes high-fidelity, semantically novel images. This work establishes a new paradigm for few-shot and long-tailed image generation.

Technology Category

Application Category

📝 Abstract
Deep generative models are proficient in generating realistic data but struggle with producing rare samples in low density regions due to their scarcity of training datasets and the mode collapse problem. While recent methods aim to improve the fidelity of generated samples, they often reduce diversity and coverage by ignoring rare and novel samples. This study proposes a novel approach for generating diverse rare samples from high-resolution image datasets with pretrained GANs. Our method employs gradient-based optimization of latent vectors within a multi-objective framework and utilizes normalizing flows for density estimation on the feature space. This enables the generation of diverse rare images, with controllable parameters for rarity, diversity, and similarity to a reference image. We demonstrate the effectiveness of our approach both qualitatively and quantitatively across various datasets and GANs without retraining or fine-tuning the pretrained GANs.
Problem

Research questions and friction points this paper is trying to address.

High-fidelity Generation
GANs
Data scarcity and mode collapse
Innovation

Methods, ideas, or system contributions that make the work stand out.

Pre-trained GANs
Mathematical Adjustment
Rare and Diverse Instance Generation