Markovian Generation Chains in Large Language Models

📅 2026-03-11
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study investigates the evolutionary dynamics of Chinese text generation by large language models under memoryless iterative conditions. We formalize the process as a sentence-level Markovian generation chain, in which each step depends only on a fixed prompt template and the previous step's output. Through restatement and round-trip translation experiments, we analyze the emergent patterns of textual evolution. As the first work to model iterative generation as a Markov chain, we uncover the mechanisms by which temperature settings and initial inputs govern the trajectory of textual diversity: within a finite number of steps, the iterative process either converges to a small cyclic set of sentences or continues producing novel outputs, with diversity increasing or decreasing depending on the parameter configuration.

📝 Abstract
The widespread use of large language models (LLMs) raises an important question: how do texts evolve when they are repeatedly processed by LLMs? In this paper, we define this iterative inference process as a Markovian generation chain, where each step takes a specific prompt template and the previous output as input, without including any prior memory. In iterative rephrasing and round-trip translation experiments, the output either converges to a small recurrent set or continues to produce novel sentences over a finite horizon. Through sentence-level Markov chain modeling and analysis of simulated data, we show that the iterative process can either increase or reduce sentence diversity, depending on factors such as the temperature parameter and the initial input sentence. These results offer valuable insights into the dynamics of iterative LLM inference and their implications for multi-agent LLM systems.
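The chain structure described above can be sketched in a few lines. This is a minimal toy simulation, not the paper's method: the hypothetical `rephrase` function stands in for a deterministic LLM call (temperature 0) over a tiny sentence space, and the loop detects whether iteration enters a recurrent (cyclic) set or keeps producing novel sentences within a step budget.

```python
# Toy sketch of a sentence-level Markovian generation chain.
# `rephrase` is a hypothetical stand-in for one LLM inference step
# s_{t+1} = f(prompt_template, s_t); here it is a fixed deterministic map.

def rephrase(sentence: str) -> str:
    # Hypothetical deterministic rephrasing map (temperature 0 analogue).
    table = {
        "the cat sat": "a cat was sitting",
        "a cat was sitting": "the cat sat down",
        "the cat sat down": "a cat was sitting",  # closes a cycle of length 2
    }
    return table.get(sentence, "a cat was sitting")

def run_chain(start: str, max_steps: int = 20):
    """Iterate the chain; return the recurrent set if a cycle appears,
    or None if the chain is still producing novel sentences at the budget."""
    seen = {}  # sentence -> step of first appearance (insertion order = visit order)
    s = start
    for t in range(max_steps):
        if s in seen:
            trajectory = list(seen)
            return trajectory[seen[s]:]  # the recurrent (cyclic) set
        seen[s] = t
        s = rephrase(s)
    return None

print(run_chain("the cat sat"))
```

Under a nonzero temperature the transition would be stochastic rather than a fixed map, which is what allows diversity to grow instead of collapsing into a cycle.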
Problem

Research questions and friction points this paper is trying to address.

Markovian generation chains
large language models
iterative inference
sentence diversity
text evolution
Innovation

Methods, ideas, or system contributions that make the work stand out.

Markovian generation chains
iterative inference
large language models
sentence diversity
multi-agent LLM systems