Classic4Children: Adapting Chinese Literary Classics for Children with Large Language Model

📅 2025-02-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study addresses children's limited comprehension of classical Chinese literature, whose archaic language and intricate narrative structures put it out of reach, by proposing the task of Child-Friendly Literary Adaptation (CLA). Methodologically, it introduces an integrated framework combining character personality modeling, narrative structure guidance, and a readability reward mechanism, augmented by a lookahead decoding strategy at inference time. The technical pipeline encompasses fine-grained instruction tuning, structured prompt engineering, reinforcement learning for human preference alignment, and a custom readability evaluator. Experiments on the Classic4Children dataset, curated from the Four Great Classical Novels, demonstrate substantial improvements in both automatic and human evaluation: the generated adaptations exhibit strong child appeal, coherent narrative logic, language suited to upper-elementary readers (Grades 4–6), and preserved moral, historical, and humanistic educational value.
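The summary mentions a custom readability evaluator used as a reward signal. The paper's actual metric is not reproduced on this page; the sketch below is a hypothetical proxy that rewards short sentences and a high ratio of common characters, just to illustrate how such a score could be bounded in [0, 1] for use as a reward:

```python
def readability_score(text, common_chars, max_sentence_len=20):
    """Toy readability proxy in [0, 1]: higher for shorter sentences and
    more common characters. Illustrative only -- NOT the paper's metric."""
    # Normalize Chinese sentence-final punctuation, then split into sentences.
    sentences = [s for s in text.replace("！", "。").replace("？", "。").split("。") if s]
    if not sentences:
        return 0.0
    # Reward short average sentence length.
    avg_len = sum(len(s) for s in sentences) / len(sentences)
    length_term = max(0.0, 1.0 - avg_len / max_sentence_len)
    # Reward characters drawn from a child-appropriate vocabulary.
    chars = [c for s in sentences for c in s]
    common_term = sum(c in common_chars for c in chars) / len(chars)
    return 0.5 * length_term + 0.5 * common_term
```

A score like this can serve directly as a scalar reward for preference alignment, since it needs no reference text.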

📝 Abstract
Chinese literary classics hold significant cultural and educational value, offering deep insights into morality, history, and human nature. These works often include classical Chinese and complex narratives, making them difficult for children to read. To bridge this gap, we introduce a child-friendly literary adaptation (CLA) task to adapt Chinese literary classics into engaging and accessible text for children. However, recent large language models (LLMs) overlook children's reading preferences (i.e., vivid character portrayals, concise narrative structures, and appropriate readability), which poses challenges in CLA. In this paper, we propose a method called InstructChild, which augments the LLM with these preferences for adaptation. Specifically, we first obtain the characters' personalities and narrative structure as additional information for fine-grained instruction tuning. Then, we devise a readability metric as the reward to align the LLM with the children's reading level. Finally, a lookahead decoding strategy is applied to improve the readability of the generated text during inference. To support the evaluation of the CLA task, we construct the Classic4Children dataset, which comprises both the original and child-friendly versions of the Four Great Classical Novels of Chinese literature. Experimental results show that our InstructChild significantly improves automatic and human evaluation performance.
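The abstract's lookahead decoding strategy can be pictured as reranking the model's top next-token candidates by a combination of language-model probability and a readability score of the resulting text. The interface below (`lm_step`, `score_fn`, the `alpha` mixing weight) is hypothetical; the paper's exact procedure may differ:

```python
import math

def lookahead_decode(lm_step, score_fn, prefix, steps=3, k=4, alpha=0.5):
    """Greedy decoding with lookahead reranking (illustrative sketch).
    At each step, take the top-k continuations proposed by lm_step(text)
    -> [(token, prob), ...], and keep the one maximizing a mix of LM
    log-probability and the readability of the extended text."""
    text = prefix
    for _ in range(steps):
        candidates = sorted(lm_step(text), key=lambda tp: -tp[1])[:k]
        best, best_val = None, -math.inf
        for tok, prob in candidates:
            cand = text + tok
            # Trade off fluency (log prob) against readability of the lookahead text.
            val = alpha * math.log(prob) + (1 - alpha) * score_fn(cand)
            if val > best_val:
                best, best_val = cand, val
        text = best
    return text
```

With `alpha = 1.0` this reduces to ordinary greedy decoding; lowering `alpha` lets the readability signal steer generation toward simpler continuations at inference time, without retraining.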
Problem

Research questions and friction points this paper is trying to address.

Language Models
Children's Literature
Adaptation of Classic Literature
Innovation

Methods, ideas, or system contributions that make the work stand out.

InstructChild method
Children's literature adaptation
Large language model simplification
Jiali Chen
Apple
Machine Learning
Xusen Hei
Key Laboratory of Big Data and Intelligent Robot (South China University of Technology) Ministry of Education, School of Software Engineering, South China University of Technology
Yuqi Xue
University of Illinois Urbana-Champaign
Computer Architecture
Zihan Wu
Key Laboratory of Big Data and Intelligent Robot (South China University of Technology) Ministry of Education, School of Software Engineering, South China University of Technology
Jiayuan Xie
The Hong Kong Polytechnic University
Yi Cai
Key Laboratory of Big Data and Intelligent Robot (South China University of Technology) Ministry of Education, School of Software Engineering, South China University of Technology