SoT: Structured-of-Thought Prompting Guides Multilingual Reasoning in Large Language Models

📅 2025-10-02
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the significant degradation in complex reasoning capabilities of large language models (LLMs) for low-resource languages compared to English, this paper proposes Structured-of-Thought (SoT), a training-free method. SoT bridges linguistic disparities via two core mechanisms: language-specific semantic transformation and structured knowledge conversion—mapping language-dependent semantics into language-agnostic, structured intermediate representations. Crucially, it introduces the first structured thinking prompting mechanism to construct cross-lingually stable, multi-step reasoning paths. SoT is architecture-agnostic, requires no fine-tuning or additional parameters, and seamlessly integrates with existing training-free strategies. Evaluated on multiple multilingual reasoning benchmarks—including XLogic, MGSM, and XCOPA—SoT consistently outperforms strong baselines by an average of +8.2%. Its orthogonality to prior zero-shot methods enables further performance gains, establishing a new state-of-the-art for multilingual reasoning without parameter updates.
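The paper's exact prompt templates live in the linked repository; below is a minimal, illustrative sketch of how the two transformations could be chained with an OpenAI-compatible chat API. The prompt wording, the model name, and the `ask`/`sot_answer` helpers are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch of the two-stage SoT prompting pipeline described
# above. Prompt wording, model name, and helper names are hypothetical;
# the paper's actual templates are at https://github.com/Cherry-qwq/SoT.
from openai import OpenAI  # assumes an OpenAI-compatible endpoint

client = OpenAI()

def ask(prompt: str, model: str = "gpt-4o-mini") -> str:
    """Single chat-completion call; SoT itself is training-free prompting."""
    resp = client.chat.completions.create(
        model=model, messages=[{"role": "user", "content": prompt}]
    )
    return resp.choices[0].message.content

def sot_answer(query: str) -> str:
    # Stage 1: Language Thinking Transformation -- strip language-specific
    # phrasing down to the underlying entities, quantities, and relations.
    semantics = ask(
        "Identify the core entities, quantities, and relations in the "
        f"following question, independent of its surface language:\n{query}"
    )
    # Stage 2: Structured Knowledge Transformation -- recast those semantics
    # as a language-agnostic structured representation.
    structure = ask(
        "Rewrite this analysis as a structured list of facts plus the goal "
        f"to be solved:\n{semantics}"
    )
    # Reason over the structured representation so the reasoning path stays
    # stable regardless of the query's original language.
    return ask(
        "Using only these structured facts, reason step by step and answer "
        f"the original question.\nFacts:\n{structure}\nQuestion:\n{query}"
    )
```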

📝 Abstract
Recent developments have enabled Large Language Models (LLMs) to engage in complex reasoning tasks through deep thinking. However, this reasoning capacity has not been successfully transferred to non-high-resource languages due to resource constraints, and LLMs consequently struggle with multilingual reasoning tasks. To this end, we propose Structured-of-Thought (SoT), a training-free method that improves multilingual reasoning performance through a multi-step transformation: Language Thinking Transformation and Structured Knowledge Transformation. SoT converts language-specific semantic information into language-agnostic structured representations, enabling models to understand queries in different languages at a more sophisticated level. In addition, SoT effectively guides LLMs toward more concentrated reasoning, maintaining consistent underlying reasoning pathways when handling cross-lingual variations in expression. Experimental results demonstrate that SoT outperforms several strong baselines on multiple multilingual reasoning benchmarks when applied to various LLM backbones. It can also be integrated with other training-free strategies for further improvements. Our code is available at https://github.com/Cherry-qwq/SoT.
Problem

Research questions and friction points this paper is trying to address.

Complex reasoning ability of LLMs degrades sharply in low-resource languages compared to English
Language-specific semantics make it hard to transfer reasoning across languages
Reasoning pathways drift when the same problem is expressed in different languages
Innovation

Methods, ideas, or system contributions that make the work stand out.

SoT converts language-specific semantics into language-agnostic structured representations
SoT guides LLMs toward concentrated, cross-lingually consistent reasoning
SoT composes with other training-free strategies for further gains (see the sketch below)
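As a concrete example of that composability, here is a hedged sketch layering self-consistency, a common training-free strategy, on top of the `sot_answer` helper sketched earlier. Sampling several SoT reasoning paths and majority-voting the final answers is an assumed integration, not the paper's reported setup.

```python
# Hypothetical composition of SoT with self-consistency: sample several
# SoT reasoning paths and majority-vote their final answers. Reuses the
# sot_answer sketch from above; repeated calls only diverge if the model
# is sampled with temperature > 0.
from collections import Counter

def sot_with_self_consistency(query: str, n_samples: int = 5) -> str:
    answers = [sot_answer(query) for _ in range(n_samples)]
    # Ties are broken by first occurrence, which Counter preserves.
    return Counter(answers).most_common(1)[0][0]
```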
Rui Qi
Key Laboratory of Big Data & Artificial Intelligence in Transportation (Beijing Jiaotong University), Ministry of Education; School of Computer Science and Technology, Beijing Jiaotong University
Zhibo Man
Key Laboratory of Big Data & Artificial Intelligence in Transportation (Beijing Jiaotong University), Ministry of Education; School of Computer Science and Technology, Beijing Jiaotong University
Yufeng Chen
Key Laboratory of Big Data & Artificial Intelligence in Transportation (Beijing Jiaotong University), Ministry of Education; School of Computer Science and Technology, Beijing Jiaotong University
Fengran Mo
Ph.D. Student, Université de Montréal
Conversational AI · Information Retrieval · Natural Language Processing · Multilingualism
Jinan Xu
Professor of School of Computer and Information Technology, Beijing Jiaotong University
NLP · Machine Translation · LLM
Kaiyu Huang
Key Laboratory of Big Data & Artificial Intelligence in Transportation (Beijing Jiaotong University), Ministry of Education; School of Computer Science and Technology, Beijing Jiaotong University