Hilbert space methods for approximating multi-output latent variable Gaussian processes

📅 2025-05-22
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Gaussian processes (GPs) suffer from poor scalability to large-scale multi-output and latent-variable settings due to their cubic computational complexity. To address this, we propose the first Hilbert space approximation framework extended to multi-output kernels and latent input configurations. Our method constructs multi-output kernels via Hilbert space embeddings, jointly optimizes latent inputs and hyperparameters, and enables high-fidelity uncertainty calibration within a scalable variational inference framework. Unlike conventional GP approaches restricted to single-output and explicit inputs, our formulation supports structured output dependencies and end-to-end latent variable learning. Empirical evaluation on synthetic benchmarks and single-cell transcriptomic data demonstrates an improved computational complexity of *O(mn)*, superior uncertainty quantification, and latent variable estimation accuracy comparable to—or exceeding—that of exact GPs. This work establishes a new paradigm for scalable, probabilistic modeling of high-dimensional, multi-output systems with latent structure.

Technology Category

Application Category

📝 Abstract
Gaussian processes are a powerful class of non-linear models, but have limited applicability for larger datasets due to their high computational complexity. In such cases, approximate methods are required, for example, the recently developed class of Hilbert space Gaussian processes. They have been shown to drastically reduce computation time while retaining most of the favourable properties of exact Gaussian processes. However, Hilbert space approximations have so far only been developed for uni-dimensional outputs and manifest (known) inputs. To this end, we generalise Hilbert space methods to multi-output and latent input settings. Through extensive simulations, we show that the developed approximate Gaussian processes are indeed not only faster, but also provides similar or even better uncertainty calibration and accuracy of latent variable estimates compared to exact Gaussian processes. While not necessarily faster than alternative Gaussian process approximations, our new models provide better calibration and estimation accuracy, thus striking an excellent balance between trustworthiness and speed. We additionally validate our findings in a real world case study from single cell biology.
Problem

Research questions and friction points this paper is trying to address.

Extend Hilbert space methods to multi-output Gaussian processes
Generalize approximations for latent input settings
Improve calibration and accuracy while maintaining speed
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generalizes Hilbert space methods to multi-output settings
Extends Hilbert space approximations to latent input scenarios
Balances computational speed with improved calibration accuracy
🔎 Similar Papers
No similar papers found.