TabMGP: Martingale Posterior with TabPFN

📅 2025-10-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Bayesian inference suffers from prior sensitivity, likelihood misspecification, and high computational cost. Martingale posteriors (MGPs) bypass explicit prior-likelihood modelling by relying on predictive rules, but the literature lacks effective constructions for tabular data. This paper introduces TabMGP, the first integration of the tabular foundation transformer TabPFN into the MGP framework, leveraging its autoregressive architecture to generate high-quality one-step-ahead predictive distributions and thereby enabling uncertainty quantification via forward simulation, without explicit priors or likelihoods. Empirically, TabMGP produces credible sets with coverage close to nominal levels across diverse tabular benchmarks, often outperforming both existing MGP constructions and standard Bayesian approaches in calibration.

📝 Abstract
Bayesian inference provides principled uncertainty quantification but is often limited by challenges of prior elicitation, likelihood misspecification, and computational burden. The martingale posterior (MGP, Fong et al., 2023) offers an alternative, replacing prior-likelihood elicitation with a predictive rule - namely, a sequence of one-step-ahead predictive distributions - for forward data generation. The utility of MGPs depends on the choice of predictive rule, yet the literature has offered few compelling examples. Foundation transformers are well-suited here, as their autoregressive generation mirrors this forward simulation and their general-purpose design enables rich predictive modeling. We introduce TabMGP, an MGP built on TabPFN, a transformer foundation model that is currently state-of-the-art for tabular data. TabMGP produces credible sets with near-nominal coverage and often outperforms both existing MGP constructions and standard Bayes.
Problem

Research questions and friction points this paper is trying to address.

Bayesian inference faces prior specification and computational challenges
Martingale posteriors replace priors with predictive forward simulation rules
TabMGP implements martingale posterior using transformer for tabular data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Martingale posterior replaces prior-likelihood with predictive rule
TabMGP uses TabPFN transformer for tabular data modeling
Autoregressive generation enables forward simulation for uncertainty quantification
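The forward-simulation idea behind the martingale posterior can be sketched in a few lines: starting from the observed data, repeatedly draw future observations from the one-step-ahead predictive rule, complete the sample, and record the functional of interest; the spread of that functional across replications forms the posterior. The sketch below is a minimal illustration using a simple Gaussian predictive rule as a stand-in for TabPFN; the function names, sample sizes, and the rule itself are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def martingale_posterior(y, predict_sample, n_forward=300, n_draws=100, rng=None):
    """Predictive resampling: extend the observed sample forward using the
    one-step-ahead predictive rule, then record a functional of the
    completed sample (here: the mean). Repeating this gives posterior draws."""
    rng = np.random.default_rng(rng)
    draws = []
    for _ in range(n_draws):
        path = list(y)
        for _ in range(n_forward):
            # Sample the next observation from the predictive rule,
            # conditioning on everything simulated so far.
            path.append(predict_sample(np.asarray(path), rng))
        draws.append(np.mean(path))  # functional of the completed sample
    return np.array(draws)

def gaussian_rule(path, rng):
    """Illustrative stand-in for TabPFN's predictive distribution:
    a normal centred at the running mean with the running std."""
    return rng.normal(path.mean(), path.std(ddof=1))

y_obs = np.random.default_rng(0).normal(1.0, 1.0, size=50)
post = martingale_posterior(y_obs, gaussian_rule, rng=1)
lo, hi = np.percentile(post, [2.5, 97.5])  # 95% credible interval for the mean
```

In TabMGP, the hand-written Gaussian rule would be replaced by TabPFN's autoregressive one-step-ahead predictive distribution, which is exactly the ingredient the paper argues foundation transformers supply.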
Kenyon Ng
Department of Econometrics and Business Statistics, Monash University
Edwin Fong
Department of Statistics and Actuarial Science, The University of Hong Kong
David T. Frazier
Department of Econometrics and Business Statistics, Monash University
Jeremias Knoblauch
Associate professor & EPSRC Fellow @ University College London
post-Bayesian inference · generalised Bayes · robustness · variational methods
Susan Wei
Monash University
Statistics