🤖 AI Summary
To address the challenge that large language models (LLMs) struggle to capture the two-dimensional structure of tabular data under parameter-efficient fine-tuning (PEFT), where conventional serialization discards positional relationships among cells, this paper proposes TableLoRA, a LoRA module specifically designed for tabular data. TableLoRA introduces a structure-aware special token encoder that preserves table geometry during serialization and a 2D LoRA module that explicitly encodes low-rank information about the row and column positions of cells. Evaluated on four representative tabular understanding tasks, TableLoRA consistently outperforms vanilla LoRA and various table encoding methods tested in control experiments, achieving significant gains in structured-data comprehension with minimal parameter overhead. This work points toward lightweight, table-aware LLM adaptation as a practical paradigm.
📄 Abstract
Tabular data are crucial in many fields, and enabling large language models (LLMs) to understand them under a high-parameter-efficiency paradigm is important. However, directly applying parameter-efficient fine-tuning (PEFT) techniques to tabular tasks presents significant challenges, particularly in table serialization and in representing two-dimensional structured information within a one-dimensional sequence. To address this, we propose TableLoRA, a module designed to improve LLMs' understanding of table structure during PEFT. It incorporates special tokens for serializing tables with a special token encoder and uses 2D LoRA to encode low-rank information on cell positions. Experiments on four table-related datasets demonstrate that TableLoRA consistently outperforms vanilla LoRA and surpasses various table encoding methods tested in control experiments. These findings show that TableLoRA, as a table-specific LoRA, enhances the ability of LLMs to process tabular data effectively, especially in low-parameter settings, demonstrating its potential as a robust solution for table-related tasks.
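To make the 2D LoRA idea concrete, here is a minimal NumPy sketch of position-conditioned low-rank adaptation. The exact formulation in the paper is not given in the abstract, so the factor names (`A_row`, `B_row`, `A_col`, `B_col`) and the additive combination of a shared LoRA update with row- and column-indexed low-rank updates are illustrative assumptions, not the authors' implementation. As in standard LoRA, the `B` factors start at zero so the adapted layer initially matches the frozen weight.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2            # hidden size, LoRA rank
n_rows, n_cols = 4, 3  # assumed table geometry for the sketch

# Frozen base weight and a standard shared LoRA pair (delta_W = B @ A).
W = rng.normal(size=(d, d))
A = rng.normal(size=(r, d)) * 0.01
B = np.zeros((d, r))  # zero-init so the update starts at zero, as in LoRA

# Hypothetical 2D extension: extra low-rank factors indexed by each
# cell's row and column coordinates (names are illustrative).
A_row = rng.normal(size=(n_rows, r, d)) * 0.01
B_row = np.zeros((n_rows, d, r))
A_col = rng.normal(size=(n_cols, r, d)) * 0.01
B_col = np.zeros((n_cols, d, r))

def forward(x, row, col):
    """Frozen projection plus shared and position-conditioned low-rank updates."""
    out = W @ x
    out = out + B @ (A @ x)                    # shared LoRA update
    out = out + B_row[row] @ (A_row[row] @ x)  # row-coordinate update
    out = out + B_col[col] @ (A_col[col] @ x)  # column-coordinate update
    return out

# A hidden state for the token of the cell at table position (row=1, col=2).
x = rng.normal(size=d)
y = forward(x, row=1, col=2)
```

Because every `B` factor is zero-initialized, `forward` reproduces `W @ x` exactly at initialization; during fine-tuning only the low-rank factors would be trained, keeping the parameter overhead small while letting the update depend on a cell's 2D position.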