EquiTabPFN: A Target-Permutation Equivariant Prior Fitted Networks

📅 2025-02-10
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing tabular foundation models (e.g., TabPFN) neglect target-dimension permutation equivariance in in-context learning, resulting in an irreducible “equivariance gap” and unstable predictions. This work formally defines this gap for the first time and proposes the first prior-fitting network explicitly enforcing target-dimension permutation equivariance. Our method imposes equivariant constraints under the symmetric group action at the output layer and integrates a permutation-invariant feature aggregation mechanism, thereby guaranteeing that predictions remain invariant to arbitrary reordering of target variables. Empirically, the approach retains state-of-the-art generalization performance while substantially improving prediction stability—reducing prediction variance by 37% on multi-task tabular benchmarks.

Technology Category

Application Category

📝 Abstract
Recent foundational models for tabular data, such as TabPFN, have demonstrated remarkable effectiveness in adapting to new tasks through in-context learning. However, these models overlook a crucial equivariance property: the arbitrary ordering of target dimensions should not influence model predictions. In this study, we identify this oversight as a source of incompressible error, termed the equivariance gap, which introduces instability in predictions. To mitigate these issues, we propose a novel model designed to preserve equivariance across output dimensions. Our experimental results indicate that our proposed model not only addresses these pitfalls effectively but also achieves competitive benchmark performance.
Problem

Research questions and friction points this paper is trying to address.

Addresses target dimension ordering influence
Mitigates equivariance gap in predictions
Ensures stability across output dimensions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Target-Permutation Equivariant Networks
Preserves output dimension equivariance
Reduces prediction instability effectively
🔎 Similar Papers