Supervised Fine-Tuning of Large Language Models for Domain-Specific Knowledge Graph Construction: A Case Study on Hunan's Historical Celebrities

📅 2025-11-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing large language models (LLMs) exhibit limited capability in knowledge extraction and structured generation for Hunan's historical figures due to data scarcity and insufficient domain-specific cultural knowledge. Method: We propose an instruction-tuning framework tailored to low-resource regional cultures, featuring a Huxiang-culture schema-guided instruction template, a fine-grained domain-specific instruction dataset, and parameter-efficient fine-tuning (PEFT) applied to Qwen2.5-7B, Qwen3-8B, DeepSeek-R1-Distill-Qwen-7B, and Llama-3.1-8B-Instruct. Contribution/Results: A domain-specific evaluation benchmark is established. Experimental results show that Qwen3-8B achieves 89.39 points under a 100-sample, 50-epoch training regime, significantly outperforming the baseline models. This work provides a reusable methodology and practical paradigm for lightweight, fine-grained construction of cultural heritage knowledge graphs.

📝 Abstract
Large language models and knowledge graphs offer strong potential for advancing research on historical culture by supporting the extraction, analysis, and interpretation of cultural heritage. Using Hunan's modern historical celebrities shaped by Huxiang culture as a case study, pre-trained large models can help researchers efficiently extract key information, including biographical attributes, life events, and social relationships, from textual sources and construct structured knowledge graphs. However, systematic data resources for Hunan's historical celebrities remain limited, and general-purpose models often underperform in domain knowledge extraction and structured output generation in such low-resource settings. To address these issues, this study proposes a supervised fine-tuning approach for enhancing domain-specific information extraction. First, we design a fine-grained, schema-guided instruction template tailored to the Hunan historical celebrities domain and build an instruction-tuning dataset to mitigate the lack of domain-specific training corpora. Second, we apply parameter-efficient instruction fine-tuning to four publicly available large language models - Qwen2.5-7B, Qwen3-8B, DeepSeek-R1-Distill-Qwen-7B, and Llama-3.1-8B-Instruct - and develop evaluation criteria for assessing their extraction performance. Experimental results show that all models exhibit substantial performance gains after fine-tuning. Among them, Qwen3-8B achieves the strongest results, reaching a score of 89.3866 with 100 samples and 50 training iterations. This study provides new insights into fine-tuning vertical large language models for regional historical and cultural domains and highlights their potential for cost-effective applications in cultural heritage knowledge extraction and knowledge graph construction.
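The schema-guided instruction template described in the abstract can be sketched as follows. This is a minimal illustration only: the field names, prompt wording, and example passage are assumptions for demonstration, not the paper's actual schema or dataset.

```python
# Hypothetical sketch of a schema-guided instruction template for extracting
# structured facts about Hunan's historical celebrities. Field names and
# wording are illustrative assumptions, not the paper's actual template.

CELEBRITY_SCHEMA = {
    "name": "full name of the historical figure",
    "birth_year": "year of birth (integer or 'unknown')",
    "birthplace": "place of birth within Hunan, if stated",
    "life_events": "list of major dated events",
    "social_relations": "list of (relation_type, other_person) pairs",
}

def build_instruction(source_text: str) -> str:
    """Render one instruction-tuning prompt from the schema and a passage."""
    schema_desc = "\n".join(f"- {k}: {v}" for k, v in CELEBRITY_SCHEMA.items())
    return (
        "Extract the following fields about the historical figure "
        "mentioned in the passage and answer as a JSON object.\n"
        f"Fields:\n{schema_desc}\n"
        f"Passage: {source_text}\n"
        "Answer:"
    )

prompt = build_instruction("Zeng Guofan (1811-1872) was born in Xiangxiang, Hunan.")
print(prompt)
```

Pairing each such prompt with a gold JSON answer yields one instruction-tuning sample; the fixed schema block is what constrains the model toward structured output in low-resource settings.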
Problem

Research questions and friction points this paper is trying to address.

Limited systematic data resources for Hunan's historical celebrities
General-purpose models underperform in domain knowledge extraction
Structured output generation challenges in low-resource settings
Innovation

Methods, ideas, or system contributions that make the work stand out.

Supervised fine-tuning enhances domain-specific information extraction
Schema-guided instruction template mitigates training data scarcity
Parameter-efficient fine-tuning boosts knowledge graph construction performance
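Parameter-efficient fine-tuning trains only small adapter matrices rather than the full model. The summary does not name the specific PEFT variant, so the following back-of-the-envelope sketch assumes a LoRA-style low-rank update; the dimensions and rank are illustrative, not taken from the paper.

```python
# Rough comparison of full fine-tuning vs LoRA-style PEFT parameter counts
# for a single d_out x d_in weight matrix. Numbers are illustrative only.

def full_params(d_out: int, d_in: int) -> int:
    """Trainable parameters when updating the full weight matrix W."""
    return d_out * d_in

def lora_params(d_out: int, d_in: int, rank: int) -> int:
    """Trainable parameters for a LoRA update W + B @ A,
    with B of shape (d_out, rank) and A of shape (rank, d_in)."""
    return d_out * rank + rank * d_in

# e.g. a 4096 x 4096 attention projection with rank-8 adapters
d = 4096
print(full_params(d, d))          # 16777216
print(lora_params(d, d, rank=8))  # 65536 (~0.4% of the full count)
```

This parameter reduction is what makes fine-tuning 7B-8B models on a small domain corpus (here, on the order of 100 samples) computationally feasible.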
Junjie Hao
College of Information Science and Engineering, Hunan Normal University, Changsha, 410081, Hunan, China
Chun Wang
College of Information Science and Engineering, Hunan Normal University, Changsha, 410081, Hunan, China; Hunan Provincial Key Laboratory of Philosophy and Social Sciences of Yuelushan Cultural and Digital Communication (Artificial Intelligence and International Communication AIIC), Hunan Normal University, Changsha, 410081, Hunan, China
Ying Qiao
Hunan Provincial Key Laboratory of Philosophy and Social Sciences of Yuelushan Cultural and Digital Communication (Artificial Intelligence and International Communication AIIC), Hunan Normal University, Changsha, 410081, Hunan, China; College of Computer Science and Electronic Engineering, Hunan University, Changsha, 410081, Hunan, China
Qiuyue Zuo
College of Information Science and Engineering, Hunan Normal University, Changsha, 410081, Hunan, China
Qiya Song
College of Information Science and Engineering, Hunan Normal University, Changsha, 410081, Hunan, China
Hua Ma
College of Information Science and Engineering, Hunan Normal University
Xieping Gao
College of Information Science and Engineering, Hunan Normal University, Changsha, 410081, Hunan, China; Hunan Provincial Key Laboratory of Philosophy and Social Sciences of Yuelushan Cultural and Digital Communication (Artificial Intelligence and International Communication AIIC), Hunan Normal University, Changsha, 410081, Hunan, China