🤖 AI Summary
Existing multimodal VAEs typically rely on simplistic aggregation mechanisms, limiting their capacity to capture higher-order, dynamic inter-modal dependencies. To address this, we propose MRF-VAE, a novel framework that explicitly incorporates Markov Random Fields (MRFs) into both the prior and posterior distributions of a multimodal VAE, enabling joint probabilistic modeling of complex cross-modal dependencies. By integrating multimodal variational inference with joint posterior optimization, MRF-VAE preserves strong generative capability while significantly improving the quality of collaborative representations. On PolyMNIST, it achieves performance competitive with state-of-the-art methods; on a custom synthetic benchmark featuring strong nonlinearity and high-order inter-modal couplings, it substantially outperforms baselines, demonstrating superior modeling capacity for intricate cross-modal structure. This work establishes a new, interpretable, and structurally grounded paradigm for multimodal generative modeling.
📝 Abstract
Recent advances in multimodal Variational AutoEncoders (VAEs) have highlighted their potential for modeling complex data from multiple modalities. However, many existing approaches use relatively simple aggregation schemes that may not fully capture the dynamics between different modalities. This work introduces a novel multimodal VAE that incorporates a Markov Random Field (MRF) into both the prior and posterior distributions, with the aim of capturing complex inter-modal interactions more effectively. Unlike previous models, our approach is specifically designed to model and exploit the intricacies of these relationships, enabling a more faithful representation of multimodal data. Our experiments show that the model performs competitively on the standard PolyMNIST dataset and outperforms baselines on a synthetic dataset specifically designed to stress complex inter-modal dependencies.
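To make the core idea concrete, here is a minimal sketch of the kind of pairwise MRF that could couple the latent codes of several modalities: each modality contributes a unary (standard-Gaussian) term, and learned coupling matrices link pairs of latents. This is an illustrative assumption, not the paper's actual parameterization; the modality count, latent dimension, and the coupling matrices `W` are all hypothetical placeholders for quantities the real model would learn.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 modalities, each with a 4-dimensional latent code.
n_modalities, latent_dim = 3, 4
z = [rng.normal(size=latent_dim) for _ in range(n_modalities)]

# Pairwise coupling matrices W[(i, j)], one per edge of the fully
# connected MRF over modalities (learned parameters in the real model).
W = {(i, j): rng.normal(scale=0.1, size=(latent_dim, latent_dim))
     for i in range(n_modalities) for j in range(i + 1, n_modalities)}

def mrf_log_density(z, W):
    """Unnormalized log-density of a pairwise MRF over modality latents:
    unary standard-Gaussian terms plus quadratic pairwise couplings."""
    unary = sum(-0.5 * np.dot(zi, zi) for zi in z)
    pairwise = sum(z[i] @ Wij @ z[j] for (i, j), Wij in W.items())
    return unary + pairwise

print(mrf_log_density(z, W))
```

With all couplings set to zero, the density factorizes into independent per-modality Gaussians, which is exactly the simple aggregation regime the paper argues against; nonzero couplings are what let the model express higher-order inter-modal structure.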