Probabilistic Pretraining for Neural Regression

📅 2025-08-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
Transfer learning for probabilistic regression has long lacked an effective modeling framework. This paper proposes NIAQUE, a pretraining-finetuning paradigm designed specifically for probabilistic regression. Built on a permutation-invariant neural architecture, NIAQUE combines quantile regression with cross-task knowledge transfer, enabling interpretable estimation of arbitrary quantiles. The model is pretrained jointly on diverse, heterogeneous regression datasets and adapted to downstream tasks via lightweight fine-tuning. Experiments on real-world regression benchmarks and Kaggle competitions show that NIAQUE performs strongly against baselines including XGBoost, TabPFN, and TabDPT, supporting its robustness, generalization capability, and scalability.

📝 Abstract
Transfer learning for probabilistic regression remains underexplored. This work closes this gap by introducing NIAQUE, Neural Interpretable Any-Quantile Estimation, a new model designed for transfer learning in probabilistic regression through permutation invariance. We demonstrate that pre-training NIAQUE directly on diverse downstream regression datasets and fine-tuning it on a specific target dataset enhances performance on individual regression tasks, showcasing the positive impact of probabilistic transfer learning. Furthermore, we highlight the effectiveness of NIAQUE in Kaggle competitions against strong baselines involving tree-based models and the recent neural foundation models TabPFN and TabDPT. The findings highlight NIAQUE's efficacy as a robust and scalable framework for probabilistic regression, leveraging transfer learning to enhance predictive performance.
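The "any-quantile estimation" in the abstract rests on the quantile (pinball) loss: minimizing it at level τ pulls a prediction toward the τ-quantile of the target, so a model conditioned on a sampled τ can learn the whole conditional distribution. A minimal numpy sketch of this property (function and variable names are illustrative, not from the paper), showing that the constant minimizer of the pinball loss recovers the empirical quantile:

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    # Asymmetric quantile loss: under-predictions cost tau per unit
    # of error, over-predictions cost (1 - tau) per unit of error.
    diff = y_true - y_pred
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

# Minimizing the pinball loss over a constant prediction recovers
# the empirical tau-quantile of the data.
rng = np.random.default_rng(0)
y = rng.normal(size=10_000)
tau = 0.9
candidates = np.linspace(-3.0, 3.0, 601)
losses = [pinball_loss(y, c, tau) for c in candidates]
best = candidates[int(np.argmin(losses))]
# best is close to np.quantile(y, 0.9)
```

Training one network on many τ values drawn at random (rather than one network per quantile) is what makes "arbitrary quantiles" available from a single model at inference time.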
Problem

Research questions and friction points this paper is trying to address.

Addressing underexplored transfer learning for probabilistic regression
Introducing permutation-invariant model for any-quantile estimation
Enhancing predictive performance through probabilistic pretraining and fine-tuning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural Interpretable Any-Quantile Estimation model
Pre-training on diverse regression datasets
Permutation invariance for transfer learning
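Permutation invariance here means a row's representation should not depend on the order in which its features are listed, which is what lets a single network ingest heterogeneous datasets with different schemas. A deep-sets-style sketch of the idea (an assumption for illustration; the names, the tanh layer, and the sum-pooling choice are not NIAQUE's actual architecture):

```python
import numpy as np

def encode_row(pairs, feat_embed, W):
    # Embed each (feature_id, value) pair, transform it, then
    # sum-pool: summation makes the result order-independent.
    vecs = [np.tanh(W @ np.append(feat_embed[fid], val)) for fid, val in pairs]
    return np.sum(vecs, axis=0)

rng = np.random.default_rng(1)
feat_embed = {i: rng.normal(size=4) for i in range(3)}  # stand-in for learned feature embeddings
W = rng.normal(size=(8, 5))                             # stand-in for a trained layer
pairs = [(0, 1.5), (1, -0.3), (2, 0.7)]
z_fwd = encode_row(pairs, feat_embed, W)
z_rev = encode_row(pairs[::-1], feat_embed, W)
# z_fwd and z_rev are identical: feature order does not matter
```

Because unseen datasets only contribute new (feature, value) pairs rather than a new input layout, this kind of encoder is a natural fit for pretraining across tasks with different feature sets.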