🤖 AI Summary
This study addresses the prevalent issues of homogenized narrative structure, repetitive plotlines, and stereotypical endings in Chinese web novels generated by large language models (LLMs). To tackle these problems, the work proposes the first extended framework of 34 narrative functions tailored to contemporary Chinese web fiction, grounded in Proppian narratology. Leveraging a manually annotated corpus, the authors conduct a systematic qualitative and quantitative analysis of LLM-generated texts to examine their narrative logic. The findings reveal that LLMs, owing to an inadequate grasp of the semantics underlying narrative functions, tend to rely on fixed generative patterns, resulting in impoverished narrative diversity. The work thus establishes the first systematic analytical framework designed specifically for evaluating and enhancing the narrative capabilities of LLMs in the context of Chinese web literature.
📝 Abstract
Large Language Models (LLMs) have demonstrated remarkable capabilities in narrative generation. However, they often produce structurally homogenized stories that follow repetitive arrangements and combinations of plot events and end in stereotypical resolutions. In this paper, we propose a novel theoretical framework for analysis that incorporates Proppian narratology and its concept of narrative functions. We use this framework to analyze the composition of narrative texts generated by LLMs and to uncover their underlying narrative logic. Taking Chinese web literature as our research focus, we extend Propp's narrative theory, defining 34 narrative functions suited to modern web narrative structures. We further construct a human-annotated corpus to support the analysis of narrative structure in LLM-generated text. Experiments reveal that the singular narrative logic and severe homogenization of generated texts arise primarily because current LLMs fail to correctly comprehend the meanings of narrative functions and instead adhere to rigid narrative generation paradigms.