A Pre-training Framework for Relational Data with Information-theoretic Principles

📅 2025-07-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Downstream tasks in relational databases are highly heterogeneous: diverse schema graph structures, temporal dynamics, and SQL-defined label logic cause pretrained table representations to generalize poorly. To address this, the paper proposes Task Vector Estimation (TVE), a general pre-training framework that integrates information-theoretic principles with relational dynamic modeling. Key contributions: (1) task-aware supervision signals generated via set aggregation over schema traversal graphs; (2) joint modeling of relational schema graphs and temporal evolution, explicitly encoding relational dynamics for the next time window; and (3) preservation of task-relevant side-channel information via mutual information maximization. Evaluated on the RelBench benchmark, TVE substantially outperforms state-of-the-art methods, demonstrating the effectiveness of jointly learning task-aware representations and temporal structure.
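The summary's first contribution — supervision via set aggregation over schema traversal graphs — can be pictured as pooling the embeddings of rows reached by traversing foreign keys, restricted to the next time window. A minimal sketch, assuming mean-pooling as the set aggregator (the function name, inputs, and the mean-pool choice are illustrative assumptions, not the paper's exact method):

```python
import numpy as np

def task_vector(traversed_rows, next_window_mask):
    """Hypothetical sketch of set aggregation: mean-pool the embeddings of
    rows reached by schema traversal that fall in the next time window.
    traversed_rows: (N, D) row embeddings; next_window_mask: (N,) boolean."""
    selected = traversed_rows[next_window_mask]
    if selected.size == 0:
        # No activity in the next window: fall back to a zero target.
        return np.zeros(traversed_rows.shape[1])
    return selected.mean(axis=0)

rng = np.random.default_rng(0)
rows = rng.normal(size=(5, 4))                      # embeddings of 5 traversed rows
mask = np.array([True, False, True, True, False])   # rows in the next window
tv = task_vector(rows, mask)                        # (4,) supervision target
```

Mean-pooling is permutation-invariant, which matches the set semantics of rows reached by a traversal; any other set aggregator (sum, max, attention pooling) would slot in the same way.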

📝 Abstract
Relational databases underpin critical infrastructure across a wide range of domains, yet designing generalizable pre-training strategies for learning from them remains an open challenge due to task heterogeneity. Specifically, there exist infinitely many possible downstream tasks, as tasks are defined by relational schema graphs, temporal dependencies, and SQL-defined label logic. An effective pre-training framework must account for these factors to obtain task-aware representations. By incorporating knowledge of the underlying distribution that drives label generation, downstream tasks can benefit from relevant side-channel information. To bridge this gap, we introduce Task Vector Estimation (TVE), a novel pre-training framework that constructs predictive supervisory signals via set-based aggregation over schema traversal graphs, explicitly modeling next-window relational dynamics. We formalize our approach through an information-theoretic lens, demonstrating that task-informed representations retain more relevant signal than those obtained without task priors. Extensive experiments on the RelBench benchmark show that TVE consistently outperforms traditional pre-training baselines. Our findings advocate for pre-training objectives that encode task heterogeneity and temporal structure as design principles for predictive modeling on relational databases.
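The abstract's information-theoretic claim — that task-informed representations retain more relevant signal — is usually operationalized by maximizing a lower bound on the mutual information between the learned representation and the task vector. A common such bound is InfoNCE; the sketch below assumes that estimator (the paper's exact objective is not given here, and all names are hypothetical):

```python
import numpy as np

def info_nce_loss(z, t, temperature=0.1):
    """InfoNCE lower bound on I(Z; T): each representation z_i should score
    highest against its own task vector t_i among the batch of candidates.
    Minimizing this loss maximizes the mutual-information bound."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize rows
    t = t / np.linalg.norm(t, axis=1, keepdims=True)
    logits = z @ t.T / temperature                     # (B, B) cosine similarities
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_softmax))              # cross-entropy on the diagonal

rng = np.random.default_rng(1)
z = rng.normal(size=(8, 16))       # batch of 8 representations
loss = info_nce_loss(z, z.copy())  # perfectly aligned pairs
```

The off-diagonal entries act as negatives drawn from the same batch, so no explicit negative sampling is needed; this is the standard contrastive route to "preserving task-relevant side-channel information" via mutual information maximization.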
Problem

Research questions and friction points this paper is trying to address.

Develop generalizable pre-training for relational databases
Address task heterogeneity in schema and temporal dependencies
Enhance task-aware representations via information-theoretic principles
Innovation

Methods, ideas, or system contributions that make the work stand out.

Task Vector Estimation for relational data
Set-based aggregation over schema graphs
Information-theoretic task-aware representations