DETQUS: Decomposition-Enhanced Transformers for QUery-focused Summarization

📅 2025-03-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address inaccurate summarization in query-driven table-to-text generation caused by Transformer token limits and high inference complexity on large tables, this paper proposes a table decomposition–enhanced paradigm. First, an LLM performs query-aware column selection to compress the input; second, a table-structure-aware OmniTab QA model guides fine-grained preservation of critical information. The method employs a fine-tuned encoder-decoder architecture that integrates LLM-assisted compression with structured QA guidance. On standard benchmarks, it achieves a ROUGE-L score of 0.4437, surpassing the state-of-the-art REFACTOR model (0.422) by 2.17 absolute ROUGE-L points, and jointly achieves high accuracy and strong scalability in long-table settings. This work establishes a paradigm for controllable, precise summarization over lengthy tables.
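The column-selection step described above can be illustrated with a minimal sketch. Note this is a hypothetical reconstruction: the paper uses an LLM to pick query-relevant columns, whereas the `select_relevant_columns` helper below substitutes a simple keyword-overlap scorer so the pipeline shape is runnable end to end.

```python
import re


def _tokens(text: str) -> set[str]:
    """Lowercased word tokens, punctuation stripped."""
    return {t.lower() for t in re.findall(r"\w+", text)}


def select_relevant_columns(table: dict[str, list], query: str,
                            keep_min: int = 1) -> dict[str, list]:
    """Stand-in for the LLM's query-aware column selection:
    keep columns whose header shares a token with the query."""
    query_tokens = _tokens(query)
    scored = {col: len(query_tokens & _tokens(col)) for col in table}
    kept = [col for col, s in scored.items() if s > 0]
    if not kept:
        # Fall back to the highest-scoring columns so the
        # compressed table is never empty.
        kept = sorted(scored, key=scored.get, reverse=True)[:keep_min]
    return {col: table[col] for col in kept}


table = {
    "Team name": ["A", "B"],
    "Wins": [10, 7],
    "Stadium capacity": [40000, 25000],
}
reduced = select_relevant_columns(table, "Which team has the most wins?")
# "Stadium capacity" is dropped; the reduced table is then passed to the
# summarizer (the encoder-decoder model, guided by the QA model).
```

The point of the reduction is that only the compressed, query-relevant table needs to fit inside the Transformer's token budget.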

📝 Abstract
Query-focused tabular summarization is an emerging task in table-to-text generation that synthesizes a summary response from tabular data based on user queries. Traditional transformer-based approaches face challenges due to token limitations and the complexity of reasoning over large tables. To address these challenges, we introduce DETQUS (Decomposition-Enhanced Transformers for QUery-focused Summarization), a system designed to improve summarization accuracy by leveraging tabular decomposition alongside a fine-tuned encoder-decoder model. DETQUS employs a large language model to selectively reduce table size, retaining only query-relevant columns while preserving essential information. This strategy enables more efficient processing of large tables and enhances summary quality. Our approach, equipped with table-based QA model Omnitab, achieves a ROUGE-L score of 0.4437, outperforming the previous state-of-the-art REFACTOR model (ROUGE-L: 0.422). These results highlight DETQUS as a scalable and effective solution for query-focused tabular summarization, offering a structured alternative to more complex architectures.
Problem

Research questions and friction points this paper is trying to address.

Improves query-focused tabular summarization accuracy
Addresses token limitations and table complexity challenges
Enhances summary quality via selective table reduction
Innovation

Methods, ideas, or system contributions that make the work stand out.

Decomposition-Enhanced Transformers for efficient summarization
Selective table reduction using large language models
Integration with Omnitab for improved ROUGE-L scores
Yasir Khan
Hafr Al Batin University
Xinlei Wu
University of Florida, Gainesville, Florida
Sangpil Youm
Ph.D. Student, University of Florida
Natural Language Processing, Artificial Intelligence, Network Science
Justin Ho
University of Florida, Gainesville, Florida
Aryaan Shaikh
University of Florida, Gainesville, Florida
Jairo Garciga
University of Florida, Gainesville, Florida
Rohan Sharma
University of Florida, Gainesville, Florida
Bonnie J. Dorr
University of Florida, Gainesville, Florida