Improving Generalization with Flat Hilbert Bayesian Inference

📅 2024-10-05
🏛️ arXiv.org
📈 Citations: 0 (influential: 0)
🤖 AI Summary
To address the limited generalization of conventional Bayesian inference, this paper proposes Flat Hilbert Bayesian Inference (FHBI), a framework that promotes flatness of the approximate posterior within a reproducing kernel Hilbert space (RKHS) by alternating between an adversarial functional perturbation step and a functional gradient descent step. The accompanying theory extends generalization analysis from finite-dimensional Euclidean spaces to infinite-dimensional function spaces, yielding generalization bounds that motivate the algorithm. Evaluated on the VTAB-1K cross-domain benchmark of 19 datasets, FHBI consistently outperforms seven baseline methods by notable average margins. Key contributions: (i) generalization analysis extended to infinite-dimensional function spaces; (ii) flatness-driven regularization of RKHS posteriors; and (iii) a tractable, scalable Bayesian inference procedure.
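To make the alternating scheme concrete, below is a minimal sketch of one such iteration in particle form. This is an illustrative reconstruction under stated assumptions, not the paper's reference implementation: the particle approximation, the RBF kernel, the SAM-style normalized ascent step, and all names (`fhbi_step`, `rbf_kernel`, `rho`, `lr`) are assumptions introduced here.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    # Pairwise kernel matrix K[i, j] = k(x_i, x_j) and grad_K[i, j] = d k(x_i, x_j) / d x_i.
    diff = X[:, None, :] - X[None, :, :]
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))
    grad_K = -diff / h ** 2 * K[:, :, None]
    return K, grad_K

def fhbi_step(particles, grad_log_post, rho=0.05, lr=1e-2, h=1.0):
    """One hypothetical FHBI-style iteration on an (n, d) particle array.

    Step 1 (adversarial functional perturbation): move each particle a
    distance rho along the direction that increases the loss (the
    negative log-posterior), analogous to SAM's ascent step.
    Step 2 (functional descent): apply an SVGD-style kernelized update,
    with gradients evaluated at the perturbed particles.
    """
    n = particles.shape[0]
    # Step 1: normalized ascent step of radius rho per particle.
    g = np.stack([grad_log_post(p) for p in particles])
    ascent = -g  # gradient of the negative log-posterior
    eps = rho * ascent / (np.linalg.norm(ascent, axis=1, keepdims=True) + 1e-12)
    perturbed = particles + eps
    # Step 2: kernelized descent using the perturbed gradients.
    g_pert = np.stack([grad_log_post(p) for p in perturbed])
    K, grad_K = rbf_kernel(particles, h)
    phi = (K @ g_pert + grad_K.sum(axis=0)) / n  # driving + repulsive terms
    return particles + lr * phi
```

The two-step structure mirrors sharpness-aware minimization lifted to function space: the perturbation probes the neighborhood of the current posterior approximation, and the descent step follows gradients taken at that worst-case neighbor.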

📝 Abstract
We introduce Flat Hilbert Bayesian Inference (FHBI), an algorithm designed to enhance generalization in Bayesian inference. Our approach involves an iterative two-step procedure with an adversarial functional perturbation step and a functional descent step within the reproducing kernel Hilbert spaces. This methodology is supported by a theoretical analysis that extends previous findings on generalization ability from finite-dimensional Euclidean spaces to infinite-dimensional functional spaces. To evaluate the effectiveness of FHBI, we conduct comprehensive comparisons against seven baseline methods on the VTAB-1K benchmark, which encompasses 19 diverse datasets across various domains with diverse semantics. Empirical results demonstrate that FHBI consistently outperforms the baselines by notable margins, highlighting its practical efficacy.
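For context on what a functional gradient step in an RKHS looks like: the steepest-descent direction of KL(q ‖ p) over the unit ball of a (vector-valued) RKHS is the Stein variational gradient descent (SVGD) direction. A plausible reading is that FHBI's descent step builds on an update of this form, with gradients taken at the adversarially perturbed iterate; the paper's exact update may differ:

$$\phi^{*}(\cdot) \;=\; \mathbb{E}_{\theta \sim q}\big[\, k(\theta, \cdot)\, \nabla_{\theta} \log p(\theta \mid \mathcal{D}) \;+\; \nabla_{\theta} k(\theta, \cdot) \,\big],$$

where $k$ is the reproducing kernel, $q$ the current approximate posterior, and $p(\theta \mid \mathcal{D})$ the target posterior.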
Problem

Research questions and friction points this paper is trying to address.

How can the generalization ability of Bayesian inference be improved?
How can flatness-based generalization analysis, developed for finite-dimensional Euclidean spaces, be extended to infinite-dimensional function spaces?
Can the resulting method outperform established baselines across diverse datasets and domains?
Innovation

Methods, ideas, or system contributions that make the work stand out.

FHBI, an iterative two-step procedure that improves the generalization of Bayesian posterior approximation
Alternates an adversarial functional perturbation step with a functional gradient descent step in an RKHS (see the toy run after this list)
Generalization theory extended from finite-dimensional Euclidean spaces to infinite-dimensional function spaces
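Continuing the sketch given after the AI summary (same hypothetical names), a toy run on a standard 2-D Gaussian posterior, whose log-posterior gradient is simply -theta:

```python
import numpy as np

rng = np.random.default_rng(0)
particles = 3.0 * rng.normal(size=(50, 2))  # deliberately dispersed start

for _ in range(500):
    # For a standard Gaussian target, grad log p(theta) = -theta.
    particles = fhbi_step(particles, grad_log_post=lambda p: -p, rho=0.05, lr=0.1)

print(particles.mean(axis=0))  # expected close to [0, 0]
print(particles.std(axis=0))   # expected close to [1, 1]
```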