TabTune: A Unified Library for Inference and Fine-Tuning Tabular Foundation Models

📅 2025-11-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
Current tabular foundation models face deployment-oriented challenges, including heterogeneous preprocessing requirements, fragmented APIs, inconsistent fine-tuning protocols, and the absence of standardized evaluation for calibration and fairness, all of which hinder real-world adoption. To address these issues, the authors propose TabTune, a unified library for tabular foundation models. It integrates model-aware automatic preprocessing, an architecture-decoupled design, and a modular evaluation system supporting zero-shot inference, meta-learning, full-parameter supervised fine-tuning, and parameter-efficient fine-tuning (PEFT) strategies. TabTune also introduces a cross-model evaluation protocol that jointly assesses performance, calibration, and fairness, improving experimental reproducibility and deployment efficiency. By unifying preprocessing, training, and evaluation standards, TabTune advances tabular foundation models toward standardization and industrial scalability.

📝 Abstract
Tabular foundation models represent a growing paradigm in structured data learning, extending the benefits of large-scale pretraining to tabular domains. However, their adoption remains limited due to heterogeneous preprocessing pipelines, fragmented APIs, inconsistent fine-tuning procedures, and the absence of standardized evaluation for deployment-oriented metrics such as calibration and fairness. We present TabTune, a unified library that standardizes the complete workflow for tabular foundation models through a single interface. TabTune provides consistent access to seven state-of-the-art models and supports multiple adaptation strategies, including zero-shot inference, meta-learning, supervised fine-tuning (SFT), and parameter-efficient fine-tuning (PEFT). The framework automates model-aware preprocessing, manages architectural heterogeneity internally, and integrates evaluation modules for performance, calibration, and fairness. Designed for extensibility and reproducibility, TabTune enables consistent benchmarking of adaptation strategies for tabular foundation models.
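The "single interface over heterogeneous models and adaptation strategies" pattern described above can be sketched with a small dispatch class. All names here (`TabularPipeline`, the strategy registry, `"tabpfn-v2"`) are illustrative assumptions, not TabTune's actual API.

```python
# Illustrative sketch of a unified adaptation interface for tabular
# foundation models. Every name below is hypothetical; this is a design
# sketch of the pattern, not TabTune's implementation.
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class TabularPipeline:
    """Routes data through one interface regardless of which adaptation
    strategy (zero-shot, meta-learning, SFT, PEFT) is selected."""
    model_name: str
    strategy: str = "zero_shot"
    _registry: dict[str, Callable] = field(default_factory=dict)

    def register(self, strategy: str, fn: Callable) -> None:
        # Architectural differences between backbones stay hidden
        # behind this strategy dispatch table.
        self._registry[strategy] = fn

    def fit(self, X: list, y: list) -> Any:
        if self.strategy not in self._registry:
            raise ValueError(f"unknown strategy: {self.strategy}")
        return self._registry[self.strategy](X, y)

# Usage: register a trivial "zero_shot" strategy and invoke it through
# the same fit() call that any other strategy would use.
pipe = TabularPipeline(model_name="tabpfn-v2", strategy="zero_shot")
pipe.register("zero_shot", lambda X, y: {"model": pipe.model_name, "n": len(X)})
result = pipe.fit([[0.1, 1.2], [0.3, 0.8]], [0, 1])
```

The design choice here is that callers never branch on model architecture; swapping backbones or strategies changes only constructor arguments.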
Problem

Research questions and friction points this paper is trying to address.

Standardizing heterogeneous preprocessing pipelines for tabular foundation models
Unifying fragmented APIs and inconsistent fine-tuning procedures
Addressing lack of standardized evaluation for calibration and fairness metrics
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unified library standardizes tabular foundation model workflows
Automates preprocessing and manages architectural heterogeneity internally
Supports multiple adaptation strategies with integrated evaluation modules
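The calibration side of an integrated evaluation module can be illustrated with expected calibration error (ECE), a standard metric that bins predicted confidences and compares them with empirical accuracy. This is a self-contained sketch of the metric itself, not TabTune's evaluation code.

```python
# Expected calibration error (ECE) for binary classifiers: bin the
# confidence of the predicted class, then average |accuracy - confidence|
# across bins, weighted by bin occupancy. Standalone sketch, not
# TabTune's implementation.
def expected_calibration_error(probs, labels, n_bins=10):
    """probs: predicted P(y=1) per example; labels: true 0/1 labels."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, labels):
        conf = max(p, 1 - p)                 # confidence of predicted class
        pred = 1 if p >= 0.5 else 0
        idx = min(int(conf * n_bins), n_bins - 1)
        bins[idx].append((conf, int(pred == y)))
    ece, n = 0.0, len(probs)
    for b in bins:
        if b:
            avg_conf = sum(c for c, _ in b) / len(b)
            acc = sum(hit for _, hit in b) / len(b)
            ece += (len(b) / n) * abs(acc - avg_conf)
    return ece

# A perfectly confident, perfectly correct predictor is perfectly
# calibrated, so its ECE is 0.
print(expected_calibration_error([1.0, 0.0, 1.0], [1, 0, 1]))  # → 0.0
```

Reporting such a metric alongside accuracy and fairness statistics, under one protocol for every model, is what makes cross-model comparisons of adaptation strategies meaningful.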