Convergence and Stability Analysis of Self-Consuming Generative Models with Heterogeneous Human Curation

📅 2025-11-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper investigates the convergence and stability of self-consuming generative models under heterogeneous human preferences. Addressing the dynamic evolution induced by multi-round retraining on “real–generated” data mixtures, we develop a novel analytical framework integrating nonlinear Perron–Frobenius theory with dynamical systems methods. Going beyond the classical Banach contraction mapping paradigm, we establish rigorous convergence guarantees under non-standard assumptions. We systematically characterize asymptotic behaviors across four distinct preference regimes and derive universal convergence conditions and stability criteria. Our results reveal how preference heterogeneity fundamentally governs stability versus instability mechanisms in self-consuming learning dynamics. Moreover, they substantially broaden the theoretical applicability of self-consuming models, providing critical foundations for controllable and robust deployment of generative AI systems.
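The retraining loop described above can be illustrated with a deliberately simple toy model (my own sketch, not the paper's construction): a 1-D Gaussian "generative model" that is refit each round on a mixture of real data and its own previous-round synthetic samples. The mixing fraction `LAMBDA` and all parameter values below are assumptions chosen for illustration; with a fixed share of real data, the iterates are pulled back toward the real distribution's parameters rather than collapsing.

```python
import random
import statistics

random.seed(0)

REAL_MEAN, REAL_STD = 0.0, 1.0  # the "real" data distribution
LAMBDA = 0.5      # fraction of real data in each round's training mixture
N = 20_000        # training samples per round
ROUNDS = 30

mu, sigma = 3.0, 2.0  # deliberately mis-initialized model parameters

for _ in range(ROUNDS):
    n_real = int(LAMBDA * N)
    real = [random.gauss(REAL_MEAN, REAL_STD) for _ in range(n_real)]
    synthetic = [random.gauss(mu, sigma) for _ in range(N - n_real)]
    mixture = real + synthetic
    # "Retraining" here is just a maximum-likelihood refit of the Gaussian.
    mu = statistics.fmean(mixture)
    sigma = statistics.pstdev(mixture)
```

After the loop, `(mu, sigma)` sits close to `(REAL_MEAN, REAL_STD)`: the real-data fraction acts as an anchor for the self-consuming dynamics, which is the kind of fixed-point behavior the paper analyzes rigorously.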

📝 Abstract
Self-consuming generative models have received significant attention over the last few years. In this paper, we study a self-consuming generative model with heterogeneous preferences that generalizes the model in Ferbach et al. (2024). The model is retrained round by round on real data together with its previous-round synthetic outputs. The asymptotic behavior of the retraining dynamics is investigated across four regimes using different techniques, including nonlinear Perron--Frobenius theory. Our analyses improve upon those of Ferbach et al. (2024) and provide convergence results in settings where the well-known Banach contraction mapping arguments do not apply. Stability and instability results for the retraining dynamics are also given.
Problem

Research questions and friction points this paper is trying to address.

Analyzes convergence of self-consuming generative models with human curation
Studies retraining dynamics using real data and synthetic outputs
Provides stability results where contraction mapping arguments fail
Innovation

Methods, ideas, or system contributions that make the work stand out.

Self-consuming generative models with heterogeneous preferences
Retraining dynamics analyzed using nonlinear Perron-Frobenius theory
Convergence results beyond Banach contraction mapping arguments
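To give intuition for the Perron–Frobenius angle, here is a hedged sketch (my own example, not taken from the paper): nonlinear Perron–Frobenius theory concerns monotone, positively homogeneous maps on the positive cone, whose normalized iterates converge to a unique positive eigenvector via contraction in Hilbert's projective metric, even when the map is not a Banach contraction in any norm. The map `f` below, mixing geometric and arithmetic means, is one such toy example.

```python
import math

def f(x):
    # Monotone, homogeneous-of-degree-1 map on the positive quadrant:
    # componentwise geometric and arithmetic means of the coordinates.
    a, b = x
    return (math.sqrt(a * b), 0.5 * (a + b))

x = (1.0, 9.0)
for _ in range(60):
    y = f(x)
    s = max(y)                     # normalize so iterates stay bounded
    x = (y[0] / s, y[1] / s)

# The normalized iterates converge to the map's unique positive
# eigenvector, here proportional to (1, 1).
```

Power-iteration-style arguments of this kind are what replace the contraction-mapping step when the retraining operator fails to be a strict contraction.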
Hongru Zhao
School of Statistics, University of Minnesota
Jinwen Fu
School of Statistics, University of Minnesota
Tuan Pham
University of California, Irvine
Machine Learning · Computer Vision