TableReasoner: Advancing Table Reasoning Framework with Large Language Models

📅 2025-07-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
Real-world table question answering (TQA) faces challenges including large-scale tables, incomplete column semantics, and entity ambiguity. Method: This paper proposes TableReasoner, a framework integrating large language models (LLMs) with programmatic reasoning. It (1) introduces a multi-step schema linking strategy that derives a focused, structure-aware table representation to mitigate semantic ambiguity; (2) designs an iterative think-reason-reflect architecture for joint structural and semantic modeling; and (3) incorporates LLM-driven programmatic reasoning to generate interpretable, executable queries. Contribution/Results: The framework achieves first place on both subtasks of SemEval-2025 Task 8, improving reasoning accuracy and robustness on complex, large-scale, and low-quality tables with sparse structure and ambiguous entities.
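The schema linking step above can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the real system uses an LLM to judge column relevance, while this stand-in uses simple token overlap between the question and each column's name and sample values.

```python
# Hypothetical sketch of multi-step schema linking: reduce a wide table to a
# focused schema that keeps only query-relevant columns. The paper uses an
# LLM for the relevance judgment; a token-overlap heuristic stands in here.

def build_schema(table: dict[str, list]) -> dict[str, dict]:
    """Combine structural and semantic views: per-column type and samples."""
    schema = {}
    for col, values in table.items():
        schema[col] = {
            "type": type(values[0]).__name__ if values else "unknown",
            "samples": values[:3],
        }
    return schema

def link_schema(schema: dict, question: str) -> dict:
    """Keep columns whose name or sample values overlap with the question."""
    q_tokens = set(question.lower().split())
    focused = {}
    for col, info in schema.items():
        col_tokens = set(col.lower().replace("_", " ").split())
        sample_tokens = {str(v).lower() for v in info["samples"]}
        if col_tokens & q_tokens or sample_tokens & q_tokens:
            focused[col] = info
    return focused

table = {
    "player_name": ["Alice", "Bob", "Carol"],
    "goals": [12, 7, 9],
    "club_city": ["Leeds", "York", "Hull"],
}
schema = build_schema(table)
focused = link_schema(schema, "How many goals did the player Bob score?")
print(sorted(focused))  # → ['goals', 'player_name']
```

The focused schema retains only the columns needed for query refinement, which is what lets the downstream programming step work on large tables without seeing every column.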

📝 Abstract
The paper presents our system developed for table question answering (TQA). TQA tasks face challenges due to the characteristics of real-world tabular data, such as large size, incomplete column semantics, and entity ambiguity. To address these issues, we propose a large language model (LLM)-powered and programming-based table reasoning framework, named TableReasoner. It models a table using the schema that combines structural and semantic representations, enabling holistic understanding and efficient processing of large tables. We design a multi-step schema linking plan to derive a focused table schema that retains only query-relevant information, eliminating ambiguity and alleviating hallucinations. This focused table schema provides precise and sufficient table details for query refinement and programming. Furthermore, we integrate the reasoning workflow into an iterative thinking architecture, allowing incremental cycles of thinking, reasoning and reflection. Our system achieves first place in both subtasks of SemEval-2025 Task 8.
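The iterative thinking architecture from the abstract can be sketched as a loop of draft, execute, and reflect. This is a minimal sketch under stated assumptions: `think` stands in for an LLM call and is hard-wired to fail once so the reflection path is exercised; all names are illustrative, not from the paper.

```python
# Hypothetical sketch of the incremental think-reason-reflect cycle: draft a
# program, execute it, and feed any failure back as reflection until an
# answer is produced or the retry budget is exhausted.

def think(question, schema, feedback=None):
    """Stub for the LLM step that drafts an executable program."""
    if feedback is None:
        return "broken program"  # first draft fails on purpose
    return "sum(row['goals'] for row in rows)"

def execute(program, rows):
    """Run the generated program against the table rows."""
    try:
        return eval(program, {"rows": rows, "sum": sum}), None
    except Exception as exc:
        return None, f"execution error: {exc}"

def solve(question, schema, rows, max_cycles=3):
    feedback = None
    for _ in range(max_cycles):
        program = think(question, schema, feedback)  # think: draft code
        answer, error = execute(program, rows)       # reason: run it
        if error is None:
            return answer
        feedback = error                             # reflect: carry failure
    return None

rows = [{"goals": 12}, {"goals": 7}, {"goals": 9}]
print(solve("total goals?", {}, rows))  # → 28, after one reflection cycle
```

The point of the loop is that execution errors are not terminal: they become feedback for the next drafting step, which is what the abstract means by incremental cycles of thinking, reasoning and reflection.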
Problem

Research questions and friction points this paper is trying to address.

Addressing TQA challenges posed by large-scale, ambiguous tabular data
Proposing a framework for holistic table understanding using LLMs
Enhancing reasoning with multi-step schema linking and iterative workflows
Innovation

Methods, ideas, or system contributions that make the work stand out.

LLM-powered programming-based table reasoning framework
Multi-step schema linking for focused table schema
Iterative thinking architecture for reasoning workflow
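The programming-based reasoning step listed above can be illustrated end to end. This is a hedged sketch: the SQL below is hard-coded to show the execution path a generated query would take, and `sqlite3` stands in for whatever execution engine the system actually uses.

```python
# Hypothetical illustration of programmatic reasoning: given a focused
# schema, the LLM emits an executable query, which is then run against the
# table. sqlite3 is used here purely as a stand-in execution backend.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE players (name TEXT, goals INTEGER)")
conn.executemany("INSERT INTO players VALUES (?, ?)",
                 [("Alice", 12), ("Bob", 7), ("Carol", 9)])

# A query an LLM might generate for "Who scored the most goals?"
generated_sql = "SELECT name FROM players ORDER BY goals DESC LIMIT 1"
answer = conn.execute(generated_sql).fetchone()[0]
print(answer)  # → Alice
```

Executing a generated query rather than asking the LLM for the answer directly is what makes the reasoning interpretable: the query itself is an auditable artifact.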
Sishi Xiong
Institute of Artificial Intelligence (TeleAI), China Telecom Corp Ltd
Dakai Wang
Institute of Artificial Intelligence (TeleAI), China Telecom Corp Ltd
Yu Zhao
Institute of Artificial Intelligence (TeleAI), China Telecom Corp Ltd
Jie Zhang
Institute of Artificial Intelligence (TeleAI), China Telecom Corp Ltd
Changzai Pan
Institute of Artificial Intelligence (TeleAI), China Telecom Corp Ltd
Haowei He
Tsinghua University
Xiangyu Li
Institute of Artificial Intelligence (TeleAI), China Telecom Corp Ltd
Wenhan Chang
Institute of Artificial Intelligence (TeleAI), China Telecom Corp Ltd
Zhongjiang He
Institute of Artificial Intelligence (TeleAI), China Telecom Corp Ltd
Shuangyong Song
Institute of Artificial Intelligence (TeleAI), China Telecom Corp Ltd
Yongxiang Li
Professor, RMIT University