Reason Analogically via Cross-domain Prior Knowledge: An Empirical Study of Cross-domain Knowledge Transfer for In-Context Learning

📅 2026-04-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the performance limitations of in-context learning (ICL) in target domains lacking labeled examples. The authors propose enhancing ICL by transferring shared reasoning structures from cross-domain examples, revealing the existence of an “example absorption threshold” that governs successful positive transfer across domains. They demonstrate that performance gains primarily stem from the repair of reasoning structures rather than the incorporation of semantic cues. Through multiple retrieval strategies for selecting cross-domain examples, experiments show that when the absorption threshold is met, such examples significantly improve reasoning performance in the target domain. These findings provide both theoretical grounding and practical guidance for efficient example retrieval and effective knowledge transfer in ICL settings.
📝 Abstract
Despite its success, in-context learning (ICL) typically relies on in-domain expert demonstrations, limiting its applicability when expert annotations are scarce. We posit that different domains may share underlying reasoning structures, enabling source-domain demonstrations to improve target-domain inference despite semantic mismatch. To test this hypothesis, we conduct a comprehensive empirical study of different retrieval methods to assess the feasibility of cross-domain knowledge transfer in the ICL setting. Our results demonstrate conditional positive transfer in cross-domain ICL. We identify a clear example absorption threshold: beyond it, positive transfer becomes more likely, and additional demonstrations yield larger gains. Further analysis suggests that these gains stem from the repair of reasoning structures by retrieved cross-domain examples, rather than from semantic cues. Overall, our study validates the feasibility of leveraging cross-domain knowledge transfer to improve cross-domain ICL performance, and motivates the community to design more effective retrieval approaches for this new direction. Our implementation is available at https://github.com/littlelaska/ICL-TF4LR
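The retrieval step the abstract describes — selecting source-domain demonstrations by similarity to a target-domain query and assembling them into an ICL prompt — can be sketched as follows. This is an illustrative toy, not the paper's implementation: the bag-of-words embedding, the function names, and the prompt template are all assumptions, and a real system would use a learned sentence encoder.

```python
# Illustrative sketch (NOT the paper's method): similarity-based retrieval of
# cross-domain demonstrations, assembled into an in-context learning prompt.
from collections import Counter
import math

def embed(text):
    # Toy bag-of-words embedding; a real system would use a sentence encoder.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse bag-of-words vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_demonstrations(query, source_pool, k=2):
    """Rank source-domain (question, rationale) pairs by similarity to the query
    and keep the top k as demonstrations."""
    q = embed(query)
    ranked = sorted(source_pool, key=lambda ex: cosine(q, embed(ex[0])), reverse=True)
    return ranked[:k]

def build_prompt(query, demos):
    # Concatenate retrieved demonstrations, then append the target-domain query.
    parts = [f"Q: {q}\nA: {a}" for q, a in demos]
    parts.append(f"Q: {query}\nA:")
    return "\n\n".join(parts)
```

The number of retrieved demonstrations `k` is the knob the paper's "example absorption threshold" would govern: below the threshold cross-domain examples may not help, above it additional demonstrations yield larger gains.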
Problem

Research questions and friction points this paper is trying to address.

in-context learning
cross-domain knowledge transfer
reasoning structure
demonstration scarcity
semantic mismatch
Innovation

Methods, ideas, or system contributions that make the work stand out.

cross-domain knowledge transfer
in-context learning
reasoning structure
example absorption threshold
analogical reasoning
Le Liu
Northwestern Polytechnical University
Visualization, Computer Graphics, Computer Vision, AI
Zhiming Li
Central South University
Materials Design, Materials Processing, Microstructure, Materials Properties, Physical Metallurgy
Jianzhi Yan
Harbin Institute of Technology, Shenzhen, China; Pengcheng Laboratory, Shenzhen, China
Zike Yuan
Harbin Institute of Technology, Shenzhen, China; Pengcheng Laboratory, Shenzhen, China
Shiwei Chen
Harbin Institute of Technology, Shenzhen, China; Pengcheng Laboratory, Shenzhen, China
Youcheng Pan
Pengcheng Laboratory, Shenzhen, China
Buzhou Tang
Harbin Institute of Technology, Shenzhen, China; Pengcheng Laboratory, Shenzhen, China
Qingcai Chen
Professor, Harbin Institute of Technology (Shenzhen); Peng Cheng Laboratory
Natural Language Processing, Machine Learning, Information Retrieval, Human-Machine Interaction
Yang Xiang
Associate Professor, Peng Cheng Laboratory, China
Artificial Intelligence, Predictive Modeling, Machine Learning, Natural Language Processing
Danny Dongning Sun
Pengcheng Laboratory, Shenzhen, China