Mitra: Mixed Synthetic Priors for Enhancing Tabular Foundation Models

📅 2025-10-24
📈 Citations: 0
Influential citations: 0
🤖 AI Summary
This work addresses the poor generalization of Tabular Foundation Models (TFMs) on real-world data, focusing on principled design of synthetic prior distributions. While existing TFMs achieve strong performance via pretraining on purely synthetic data, their prior construction lacks systematic guidance. To bridge this gap, we conduct the first systematic analysis of three critical properties of synthetic priors—diversity, discriminability, and high task performance—and propose a multi-prior mixture framework that overcomes the limitations of single synthetic distributions. Furthermore, we integrate Bayesian generative mechanisms with in-context learning (ICL) to enable efficient cross-dataset transfer. Extensive experiments demonstrate that our approach significantly outperforms state-of-the-art models—including TabPFNv2 and TabICL—across multiple classification and regression benchmarks, achieving superior generalization and improved sample efficiency.
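The multi-prior mixture idea described above can be sketched in a few lines: sample a prior family from a weighted mixture, then draw a synthetic classification task from that family. The generator functions, mixture weights, and dimensions below are illustrative assumptions for intuition, not Mitra's actual prior construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_prior(n, d):
    # Random-MLP target function, in the spirit of TabPFN-style SCM priors.
    X = rng.normal(size=(n, d))
    W1 = rng.normal(size=(d, 8))
    W2 = rng.normal(size=(8, 1))
    score = np.tanh(X @ W1) @ W2
    y = (score.ravel() > np.median(score)).astype(int)
    return X, y

def tree_prior(n, d):
    # Axis-aligned threshold rule, mimicking tree-structured targets.
    X = rng.normal(size=(n, d))
    j = rng.integers(d)
    t = rng.normal()
    y = (X[:, j] > t).astype(int)
    return X, y

PRIORS = [mlp_prior, tree_prior]
WEIGHTS = [0.6, 0.4]  # hypothetical mixture weights

def sample_task(n=128, d=5):
    """Draw one synthetic pretraining task from the prior mixture."""
    prior = PRIORS[rng.choice(len(PRIORS), p=WEIGHTS)]
    return prior(n, d)

X, y = sample_task()
```

During pretraining, each sampled `(X, y)` task would be split into in-context examples and query points, so the model learns to generalize across tasks drawn from several distinct prior families rather than a single one.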

📝 Abstract
Since the seminal work of TabPFN, research on tabular foundation models (TFMs) based on in-context learning (ICL) has challenged long-standing paradigms in machine learning. Without seeing any real-world data, models pretrained on purely synthetic datasets generalize remarkably well across diverse datasets, often using only a moderate number of in-context examples. This shifts the focus in tabular machine learning from model architecture design to the design of synthetic datasets, or, more precisely, to the prior distributions that generate them. Yet the guiding principles for prior design remain poorly understood. This work marks the first attempt to address the gap. We systematically investigate and identify key properties of synthetic priors that allow pretrained TFMs to generalize well. Based on these insights, we introduce Mitra, a TFM trained on a curated mixture of synthetic priors selected for their diversity, distinctiveness, and performance on real-world tabular data. Mitra consistently outperforms state-of-the-art TFMs, such as TabPFNv2 and TabICL, across both classification and regression benchmarks, with better sample efficiency.
Problem

Research questions and friction points this paper is trying to address.

Identifying key properties of synthetic priors for tabular foundation models
Designing diverse synthetic datasets to enhance model generalization
Improving sample efficiency in tabular classification and regression tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Mixed synthetic priors enhance tabular foundation models
Curated mixture of priors improves diversity and performance
Outperforms state-of-the-art models in classification and regression