rLLM: Relational Table Learning with LLMs

๐Ÿ“… 2024-07-29
๐Ÿ›๏ธ arXiv.org
๐Ÿ“ˆ Citations: 6
โœจ Influential: 1
๐Ÿ“„ PDF
๐Ÿค– AI Summary
To address the low efficiency and lack of a unified framework in relational table learning (RTL) model development, this paper introduces rLLM, a PyTorch-based open-source library enabling modular, collaborative modeling between large language models (LLMs) and graph/table neural networks (GNNs/TNNs). The authors propose a "combine, align, and co-train" RTL paradigm and a standardized module decomposition methodology, significantly accelerating model construction and improving reproducibility. They also release three high-quality benchmark datasets: TML1M (million-scale), TLF2K (fine-grained semantics), and TACM12K (cross-domain multi-task). rLLM has been adopted in both academia and industry, establishing a scalable, user-friendly, and unified infrastructure for RTL research and development.

๐Ÿ“ Abstract
We introduce rLLM (relationLLM), a PyTorch library designed for Relational Table Learning (RTL) with Large Language Models (LLMs). The core idea is to decompose state-of-the-art Graph Neural Networks, LLMs, and Table Neural Networks into standardized modules, enabling the fast construction of novel RTL-type models in a simple "combine, align, and co-train" manner. To illustrate the usage of rLLM, we introduce a simple RTL method named BRIDGE. Additionally, we present three novel relational tabular datasets (TML1M, TLF2K, and TACM12K) by enhancing classic datasets. We hope rLLM can serve as a useful and easy-to-use development framework for RTL-related tasks. Our code is available at: https://github.com/rllm-project/rllm.
Problem

Research questions and friction points this paper is trying to address.

Develops a PyTorch library for relational table learning with LLMs
Decomposes neural networks into modules for fast model construction
Introduces novel datasets and methods for relational table tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Decomposes GNNs, LLMs, TNNs into modules
Enables fast model construction via combining modules
Introduces BRIDGE method for relational table learning
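The "combine, align, and co-train" idea can be illustrated with a minimal plain-PyTorch sketch. The module names (`TableEncoder`, `GraphEncoder`), the MSE alignment term, and the fusion-by-addition step are illustrative assumptions for this sketch, not rLLM's actual API or the BRIDGE method itself.

```python
import torch
import torch.nn as nn

class TableEncoder(nn.Module):
    """Stand-in for a Table Neural Network module (hypothetical)."""
    def __init__(self, in_dim, emb_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, emb_dim)

    def forward(self, x):
        return torch.relu(self.proj(x))

class GraphEncoder(nn.Module):
    """Stand-in for a GNN module: one neighbor-aggregation step (hypothetical)."""
    def __init__(self, in_dim, emb_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, emb_dim)

    def forward(self, x, adj):
        # Aggregate features over the adjacency matrix, then project.
        return torch.relu(self.proj(adj @ x))

# Toy data: 8 rows with 4 features each, identity adjacency, binary labels.
torch.manual_seed(0)
x = torch.randn(8, 4)
adj = torch.eye(8)
labels = torch.randint(0, 2, (8,))

# 1) Combine: instantiate standardized modules plus a shared classifier head.
tab_enc, gnn_enc = TableEncoder(4, 16), GraphEncoder(4, 16)
head = nn.Linear(16, 2)
params = list(tab_enc.parameters()) + list(gnn_enc.parameters()) + list(head.parameters())
opt = torch.optim.Adam(params, lr=1e-2)

for _ in range(50):
    opt.zero_grad()
    t_emb, g_emb = tab_enc(x), gnn_enc(x, adj)
    # 2) Align: pull the table-view and graph-view embeddings together.
    align_loss = nn.functional.mse_loss(t_emb, g_emb)
    # 3) Co-train: supervised loss on the fused embedding plus the alignment term.
    logits = head(t_emb + g_emb)
    loss = nn.functional.cross_entropy(logits, labels) + 0.1 * align_loss
    loss.backward()
    opt.step()

print(logits.shape)
```

The key design point the sketch mirrors is that each component is an interchangeable `nn.Module`, so swapping in a different table or graph encoder requires no change to the training loop.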
๐Ÿ”Ž Similar Papers
Weichen Li
Shanghai Jiao Tong University, China
Xiaotong Huang
Shanghai Jiao Tong University, China
Jianwu Zheng
Shanghai Jiao Tong University, China
Zheng Wang
Shanghai Jiao Tong University, China
Chaokun Wang
Tsinghua University (Database, Multimedia, Social Networks)
Li Pan
Shanghai Jiao Tong University, China
Jianhua Li
Shanghai Jiao Tong University, China