Latent Iterative Refinement Flow: A Geometric-Constrained Approach for Few-Shot Generation

📅 2025-09-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Few-shot generative modeling suffers from overfitting and mode collapse, and existing approaches struggle to jointly preserve latent-space geometric structure and ensure high-fidelity generation. To address this, we propose a "Generate–Correct–Enhance" iterative framework. First, we introduce a manifold-preserving loss to construct a semantically consistent latent manifold. Second, we design a geometric correction operator based on contraction mapping, which theoretically guarantees monotonic reduction of the Hausdorff distance between the generated and true data manifolds. Third, we integrate iterative refinement with structure-aware data augmentation to enable geometry-informed sample evolution. Evaluated on benchmarks including AFHQ-Cat, our method generates high-resolution, high-fidelity, and diverse images, substantially outperforming state-of-the-art few-shot generative models. Ablation studies confirm the critical contribution of each component.
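The correction step in the "Generate–Correct–Enhance" cycle can be illustrated with a minimal sketch. The paper does not specify the operator's exact form here; the version below is an assumption that pulls each generated sample partway toward its nearest training point, a simple contraction for which the directed Hausdorff distance to the data set provably shrinks by the contraction factor at each step. All function names (`directed_hausdorff`, `correct`) and the parameter `lam` are illustrative, not from the paper.

```python
import numpy as np

def directed_hausdorff(A, B):
    # Directed Hausdorff distance: max over points in A of the
    # distance to the nearest point in B. A, B have shape (n, d).
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    return d.min(axis=1).max()

def correct(samples, data, lam=0.5):
    # Hypothetical correction operator: move each sample a fraction
    # lam in (0, 1) toward its nearest data point. Because every
    # sample's distance to the data set shrinks by at least (1 - lam),
    # the map is contractive toward the data manifold.
    d = np.linalg.norm(samples[:, None, :] - data[None, :, :], axis=-1)
    nearest = data[d.argmin(axis=1)]
    return (1 - lam) * samples + lam * nearest

rng = np.random.default_rng(0)
data = rng.normal(size=(50, 2))        # stand-in for true-manifold samples
gen = rng.normal(size=(20, 2)) * 3.0   # crude "generated" candidates

h0 = directed_hausdorff(gen, data)
gen = correct(gen, data)
h1 = directed_hausdorff(gen, data)
assert h1 <= 0.5 * h0  # one correction step at least halves the distance
```

In the paper's full framework this correction is applied in latent space and interleaved with generation and augmentation; the toy above only demonstrates why a contractive pull yields the monotone distance reduction the summary claims.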

📝 Abstract
Few-shot generation, the synthesis of high-quality and diverse samples from limited training data, remains a significant challenge in generative modeling. Existing methods trained from scratch often fail to overcome overfitting and mode collapse, and fine-tuning large models can inherit biases while neglecting the crucial geometric structure of the latent space. To address these limitations, we introduce Latent Iterative Refinement Flow (LIRF), a novel approach that reframes few-shot generation as the progressive densification of a geometrically structured manifold. LIRF establishes a stable latent space using an autoencoder trained with our novel manifold-preservation loss $L_{\text{manifold}}$. This loss ensures that the latent space maintains the geometric and semantic correspondence of the input data. Building on this, we propose an iterative generate-correct-augment cycle. Within this cycle, candidate samples are refined by a geometric correction operator, a provably contractive mapping that pulls samples toward the data manifold while preserving diversity. We also provide a Convergence Theorem demonstrating a predictable decrease in the Hausdorff distance between the generated and true data manifolds. We further demonstrate the framework's scalability by generating coherent, high-resolution images on AFHQ-Cat. Ablation studies confirm that both the manifold-preserving latent space and the contractive correction mechanism are critical components of this success. Ultimately, LIRF provides a solution for data-scarce generative modeling that is not only theoretically grounded but also highly effective in practice.
Problem

Research questions and friction points this paper is trying to address.

Overcoming overfitting and mode collapse in few-shot generation from limited data
Addressing geometric structure neglect in latent space during fine-tuning
Progressive densification of a geometrically structured manifold for data-scarce generation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Manifold-preservation loss maintains geometric latent structure
Iterative correction operator contracts samples toward manifold
Convergence Theorem guarantees decreasing Hausdorff distance
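The convergence claim in the bullets above presumably takes the standard contraction-mapping form sketched below; the contraction factor $q$ and the exact statement are assumptions, not quoted from the paper.

```latex
% If the correction operator T is contractive toward the data manifold M
% with factor 0 < q < 1 in Hausdorff distance d_H, then for the generated
% set S_k at iteration k:
d_H\!\big(T(S_k),\, \mathcal{M}\big) \le q\, d_H(S_k, \mathcal{M})
\quad\Longrightarrow\quad
d_H(S_k, \mathcal{M}) \le q^{k}\, d_H(S_0, \mathcal{M}) \xrightarrow{\;k\to\infty\;} 0.
```

This is the usual geometric-convergence consequence of contractivity: each pass of the correction operator shrinks the gap between the generated and true data manifolds by a constant factor.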
Songtao Li
School of Mathematics and Statistics, Huazhong University of Science and Technology, Wuhan 430074, China
Zhenyu Liao
School of Electronic Information and Communications, Huazhong University of Science and Technology, Wuhan, China
Tianqi Hou
Theory Lab, Central Research Institute, 2012 Labs, Huawei Technologies Co., Ltd.
Statistical Physics, Machine Learning, High-Dimensional Statistics, Computational Neuroscience
Ting Gao
Huazhong University of Science and Technology
Stochastic Dynamical Systems, Deep Learning, Brain Science, Quantitative Finance