🤖 AI Summary
This monograph addresses the long-standing conceptual divide between Gaussian processes (GPs) and reproducing kernel Hilbert space (RKHS) methods, two fundamental paradigms grounded in positive-definite kernels. It establishes a unified theoretical framework rooted in the isometric isomorphism between the Gaussian Hilbert space and the RKHS, and formally proves the equivalence of GP-based and RKHS-based solutions across a range of tasks: regression, interpolation, numerical integration, distributional discrepancy measurement (e.g., maximum mean discrepancy), statistical dependence quantification, and sample path analysis. In doing so, the framework bridges the conceptual gap between Bayesian probabilistic modeling and deterministic kernel methods. The results provide a coherent foundation for kernelized Bayesian inference, kernel manifold learning, and other cross-disciplinary applications, enabling a principled integration of probabilistic and functional-analytic perspectives within kernel methods.
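To make the regression equivalence concrete, here is a minimal numerical sketch of the standard textbook fact (not code from the monograph): the GP posterior mean under Gaussian observation noise of variance σ² coincides with the kernel ridge regression (KRR) estimate with regularization λ = σ²/n. The kernel, data, and hyperparameters below are illustrative assumptions.

```python
# Minimal sketch: GP posterior mean == kernel ridge regression estimate.
# All modeling choices (RBF kernel, lengthscale, noise level) are assumed
# for illustration only.
import numpy as np

def rbf_kernel(A, B, lengthscale=0.5):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * lengthscale ** 2))

rng = np.random.default_rng(0)
n = 30
X = rng.uniform(-3, 3, size=(n, 1))                 # training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)  # noisy observations
Xs = np.linspace(-3, 3, 100)[:, None]               # test inputs

sigma2 = 0.1 ** 2                                   # GP noise variance
K = rbf_kernel(X, X)
Ks = rbf_kernel(Xs, X)

# GP posterior mean: k(x*, X) (K + sigma^2 I)^{-1} y
gp_mean = Ks @ np.linalg.solve(K + sigma2 * np.eye(n), y)

# KRR solution of  min_f (1/n) sum_i (y_i - f(x_i))^2 + lam * ||f||^2
# is  k(x*, X) (K + n*lam I)^{-1} y;  with lam = sigma^2 / n the two agree.
lam = sigma2 / n
krr = Ks @ np.linalg.solve(K + n * lam * np.eye(n), y)

assert np.allclose(gp_mean, krr)                    # identical up to numerics
```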
📝 Abstract
This monograph studies the relations between two approaches based on positive definite kernels: probabilistic methods using Gaussian processes, and non-probabilistic methods using reproducing kernel Hilbert spaces (RKHSs). Both are widely studied and used in machine learning, statistics, and numerical analysis. Connections and equivalences between the two approaches are reviewed for fundamental topics such as regression, interpolation, numerical integration, distributional discrepancies, and statistical dependence, as well as for the sample path properties of Gaussian processes. A unifying perspective on these equivalences is established, based on the equivalence between the Gaussian Hilbert space and the RKHS. The monograph thus provides a basis for bridging many other methods based on Gaussian processes and reproducing kernels that have been developed in parallel by the two research communities.
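The unifying equivalence referenced above can be stated in one display; the following is a standard formulation of the construction, sketched here rather than quoted from the monograph. For a zero-mean GP $f \sim \mathcal{GP}(0, k)$, the map sending $k(\cdot, x)$ to the random variable $f(x)$ preserves inner products on the generating elements, so its linear extension is an isometric isomorphism between the RKHS $\mathcal{H}_k$ and the Gaussian Hilbert space.

```latex
% Gaussian Hilbert space of f ~ GP(0, k) on an index set \mathcal{X}:
% the L^2(\Omega)-closure of the linear span of the evaluations f(x).
% The generator map k(., x) -> f(x) preserves inner products:
\begin{equation*}
  \langle k(\cdot, x),\, k(\cdot, x') \rangle_{\mathcal{H}_k}
  \;=\; k(x, x')
  \;=\; \operatorname{Cov}\bigl(f(x), f(x')\bigr)
  \;=\; \langle f(x),\, f(x') \rangle_{L^2(\Omega)},
\end{equation*}
% hence its linear extension is an isometric isomorphism between
% \mathcal{H}_k and the Gaussian Hilbert space of f.
```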