When AI Settles Down: Late-Stage Stability as a Signature of AI-Generated Text Detection

📅 2026-01-08
🏛️ arXiv.org
📈 Citations: 1
✨ Influential: 0
๐Ÿ“„ PDF
🤖 AI Summary
This work addresses a critical limitation in existing zero-shot AI text detection methods, which overlook the temporal dynamics inherent in autoregressive generation and consequently struggle to distinguish human- from AI-authored text. Through analysis of over 120,000 samples, the study reveals, for the first time, that log-probability fluctuations in AI-generated text attenuate significantly in later generation stages. Leveraging this insight, the authors propose two lightweight statistical features derived exclusively from the latter half of a text: derivative dispersion and local volatility. Requiring neither perturbation-based sampling nor access to additional models, the method achieves state-of-the-art performance on the EvoBench and MAGE benchmarks and demonstrates strong complementarity with existing global approaches. Empirical results show that volatility in the second half of AI-generated text is 24%–32% lower than in human-written text.

๐Ÿ“ Abstract
Zero-shot detection methods for AI-generated text typically aggregate token-level statistics across entire sequences, overlooking the temporal dynamics inherent to autoregressive generation. We analyze over 120k text samples and reveal Late-Stage Volatility Decay: AI-generated text exhibits rapidly stabilizing log-probability fluctuations as generation progresses, while human writing maintains higher variability throughout. This divergence peaks in the second half of sequences, where AI-generated text shows 24–32% lower volatility. Based on this finding, we propose two simple features, Derivative Dispersion and Local Volatility, which are computed exclusively from late-stage statistics. Without perturbation sampling or additional model access, our method achieves state-of-the-art performance on the EvoBench and MAGE benchmarks and demonstrates strong complementarity with existing global methods.
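As a rough illustration (not the paper's exact formulas, which are not given here), the two late-stage features could be computed from a sequence of per-token log-probabilities as below. The window size, the use of standard deviation for dispersion, and the mean-absolute-difference definition of local volatility are all assumptions for the sketch:

```python
import numpy as np

def late_stage_features(log_probs, window=5):
    """Hypothetical sketch of the two late-stage features.

    Both statistics are computed exclusively from the second half
    of the sequence, mirroring the paper's late-stage focus.
    """
    x = np.asarray(log_probs, dtype=float)
    late = x[len(x) // 2:]            # keep only the second half

    # Derivative Dispersion: spread of first-order differences
    # (the discrete "derivative" of the log-prob curve).
    d = np.diff(late)
    derivative_dispersion = d.std()

    # Local Volatility: mean absolute change inside a sliding window,
    # averaged over all window positions.
    local_volatility = np.mean([
        np.abs(np.diff(late[i:i + window])).mean()
        for i in range(len(late) - window + 1)
    ])
    return derivative_dispersion, local_volatility
```

Under the paper's finding, AI-generated text should yield lower values of both features than human text of similar length, since its late-stage log-probability curve is flatter; a perfectly flat second half gives zero for both.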
Problem

Research questions and friction points this paper is trying to address.

AI-generated text detection
temporal dynamics
late-stage stability
zero-shot detection
autoregressive generation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Late-Stage Volatility Decay
Zero-shot Detection
Derivative Dispersion
Local Volatility
AI-Generated Text Detection