Bayesian Neural Networks for Functional ANOVA model

📅 2025-10-01
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Traditional ANOVA-TPNNs require pre-specifying the interaction order, limiting scalability to high-order components due to prohibitive computational and memory costs, which severely hinders interpretable modeling of high-dimensional functions. This paper proposes Bayesian-TPNN: the first integration of Bayesian inference into the functional ANOVA framework, using tensor-product neural networks as basis functions and efficient MCMC sampling to automatically discover high-order interaction effects, decompose functional components, and quantify uncertainty. We establish theoretical posterior consistency under mild regularity conditions. Empirical evaluations on multiple benchmark datasets demonstrate that Bayesian-TPNN significantly outperforms existing ANOVA-TPNN variants, achieving superior predictive accuracy while maintaining strong interpretability and substantially reducing both computational overhead and memory footprint.

๐Ÿ“ Abstract
With the increasing demand for interpretability in machine learning, functional ANOVA decomposition has gained renewed attention as a principled tool for breaking a high-dimensional function down into low-dimensional components that reveal the contributions of different variable groups. Recently, the Tensor Product Neural Network (TPNN) has been developed and applied as a basis function in the functional ANOVA model; the resulting model is referred to as ANOVA-TPNN. A disadvantage of ANOVA-TPNN, however, is that the components to be estimated must be specified in advance, which makes it difficult to incorporate higher-order TPNNs into the functional ANOVA model due to computational and memory constraints. In this work, we propose Bayesian-TPNN, a Bayesian inference procedure for the functional ANOVA model with TPNN basis functions that can detect higher-order components at lower computational cost than ANOVA-TPNN. We develop an efficient MCMC algorithm and demonstrate that Bayesian-TPNN performs well on multiple benchmark datasets. Theoretically, we prove that the posterior of Bayesian-TPNN is consistent.
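For reference, the functional ANOVA decomposition the abstract refers to expands a d-dimensional function into components indexed by variable subsets (a standard textbook formulation, not notation taken from this paper):

```latex
f(x_1,\dots,x_d)
  = \beta_0
  + \sum_{j} f_j(x_j)
  + \sum_{j<k} f_{jk}(x_j, x_k)
  + \cdots
  + f_{1\cdots d}(x_1,\dots,x_d)
```

Identifiability is typically enforced by requiring each component to integrate to zero in each of its arguments. ANOVA-TPNN parameterizes each included component with a tensor-product neural network, which is why the set of components must be fixed in advance; the Bayesian procedure instead lets the data determine which higher-order terms enter the sum.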
Problem

Research questions and friction points this paper is trying to address.

Detecting higher-order components in functional ANOVA models efficiently
Overcoming computational constraints of existing ANOVA-TPNN approaches
Providing Bayesian inference for interpretable machine learning decomposition
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian inference for functional ANOVA model
Efficient MCMC algorithm for higher-order components
Posterior consistency proven for Bayesian-TPNN
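As a rough illustration of the tensor-product idea behind TPNN basis functions, the sketch below builds a second-order interaction component as a sum of products of univariate basis functions. The Gaussian bumps and all names and shapes here are illustrative assumptions, not the paper's architecture (which learns the univariate maps with neural networks and infers the components via MCMC):

```python
import numpy as np

def univariate_basis(x, centers, scale=1.0):
    """Gaussian bumps as a stand-in for a learned 1-D network (illustrative)."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * scale**2))

def tensor_product_component(xj, xk, centers, weights):
    """f_{jk}(x_j, x_k) = sum_{a,b} w_{ab} * phi_a(x_j) * phi_b(x_k)."""
    Bj = univariate_basis(xj, centers)   # shape (n, m)
    Bk = univariate_basis(xk, centers)   # shape (n, m)
    # einsum forms the product basis phi_a(x_j) * phi_b(x_k) per sample
    # and contracts it against the weight matrix
    return np.einsum("na,nb,ab->n", Bj, Bk, weights)

rng = np.random.default_rng(0)
n, m = 5, 3
xj, xk = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
centers = np.linspace(-1, 1, m)
weights = rng.normal(size=(m, m))
vals = tensor_product_component(xj, xk, centers, weights)
print(vals.shape)  # -> (5,)
```

The multiplicative structure is what keeps each interaction component cheap: an order-2 component over m univariate bases needs only an m-by-m weight grid rather than a generic bivariate function approximator.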
Seokhun Park
Department of Statistics, Seoul National University, Seoul, Republic of Korea
Choeun Kim
Department of Statistics, Seoul National University, Seoul, Republic of Korea
Jihu Lee
Department of Statistics, Seoul National University, Seoul, Republic of Korea
Yunseop Shin
Department of Statistics, Seoul National University, Seoul, Republic of Korea
Insung Kong
University of Twente
Statistical Machine Learning, Deep Learning Theory, Trustworthy AI
Yongdai Kim
Seoul National University
statistics, machine learning