Multi-View Oriented GPLVM: Expressiveness and Efficiency

📅 2025-02-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the dual bottlenecks of limited kernel expressiveness and inefficient variational inference in multi-view Gaussian process latent variable models (MV-GPLVMs), this paper introduces a spectral-density-driven paradigm for kernel construction. The authors establish an explicit duality between spectral densities and kernel functions and, by modeling the spectral density as a bivariate Gaussian mixture, derive a highly expressive Next-Gen Spectral Mixture (NG-SM) kernel. Combining a random Fourier feature approximation of this kernel with a tailored reparameterization trick makes variational inference scalable. Embedded in a variational-autoencoder-style framework, the method achieves state-of-the-art performance on multiple multi-view benchmark datasets, outperforming existing MV-GPLVMs and deep multi-view models, and learns unified latent representations that are more robust, interpretable, and generalizable.

📝 Abstract
The multi-view Gaussian process latent variable model (MV-GPLVM) aims to learn a unified representation from multi-view data but is hindered by challenges such as limited kernel expressiveness and low computational efficiency. To overcome these issues, we first introduce a new duality between the spectral density and the kernel function. By modeling the spectral density with a bivariate Gaussian mixture, we then derive a generic and expressive kernel termed Next-Gen Spectral Mixture (NG-SM) for MV-GPLVMs. To address the inherent computational inefficiency of the NG-SM kernel, we propose a random Fourier feature approximation. Combined with a tailored reparameterization trick, this approximation enables scalable variational inference for both the model and the unified latent representations. Numerical evaluations across a diverse range of multi-view datasets demonstrate that our proposed method consistently outperforms state-of-the-art models in learning meaningful latent representations.
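The duality the abstract refers to is, at its core, Bochner's theorem: a stationary kernel is the Fourier transform of its spectral density, so modeling the spectral density as a Gaussian mixture yields a closed-form kernel. As an illustrative sketch (this is the classical one-dimensional spectral mixture kernel of Wilson and Adams, not the paper's bivariate NG-SM construction), the correspondence looks like:

```python
import numpy as np

def spectral_mixture_kernel(tau, weights, means, variances):
    """Classical 1-D spectral mixture kernel: a Gaussian mixture over
    the spectral density maps, via Bochner's theorem, to a weighted sum
    of cosine-modulated Gaussians in the input (lag) domain.

    tau       : input lag(s) x - x'
    weights   : mixture weights w_q
    means     : spectral means mu_q (oscillation frequencies)
    variances : spectral variances v_q (inverse length-scales)
    """
    tau = np.asarray(tau, dtype=float)
    k = np.zeros_like(tau)
    for w, mu, v in zip(weights, means, variances):
        k += w * np.exp(-2.0 * np.pi**2 * tau**2 * v) * np.cos(2.0 * np.pi * mu * tau)
    return k
```

At zero lag the kernel value is just the sum of the mixture weights, which makes the weights interpretable as signal variances per component. The NG-SM kernel generalizes this construction with a bivariate Gaussian mixture spectral density.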
Problem

Research questions and friction points this paper is trying to address.

Limited expressiveness of standard kernels in MV-GPLVMs
High computational cost of variational inference with expressive kernels
Difficulty of learning unified, meaningful latent representations from multi-view data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Introduces the Next-Gen Spectral Mixture (NG-SM) kernel via a spectral density–kernel duality
Derives a random Fourier feature approximation to tame the kernel's computational cost
Enables scalable variational inference through a tailored reparameterization trick
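The random-Fourier-feature-plus-reparameterization idea can be sketched generically. Frequencies are drawn from the kernel's spectral density by reparameterizing the draw as `omega = mean + std * eps`, which keeps the sample differentiable with respect to the spectral parameters and thus amenable to gradient-based variational inference. The sketch below (hypothetical helper names; a single Gaussian spectral component rather than the paper's bivariate mixture) approximates the implied kernel as an inner product of random features:

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, n_features, mean, std):
    """Random Fourier features for a kernel whose spectral density is a
    Gaussian with the given mean and std (a single spectral component).

    The reparameterized draw omega = mean + std * eps is differentiable
    w.r.t. mean and std, which is what allows spectral parameters to be
    learned by stochastic gradient methods.
    """
    d = X.shape[1]
    eps = rng.standard_normal((n_features, d))
    omega = mean + std * eps            # reparameterization trick
    proj = X @ omega.T                  # shape (n, n_features)
    # cos/sin pairs give an unbiased Monte Carlo estimate of the kernel
    return np.hstack([np.cos(proj), np.sin(proj)]) / np.sqrt(n_features)
```

With zero spectral mean and unit std this recovers the RBF kernel: `Phi @ Phi.T` converges to `exp(-||x - x'||**2 / 2)` as the number of features grows, and each diagonal entry is exactly 1 since `cos**2 + sin**2 = 1`.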