🤖 AI Summary
This paper addresses the challenge of modeling dynamic inter-view couplings in multi-view data by proposing a unified geometric framework based on interleaved Multi-view Diffusion Trajectories (MDT). Methodologically, it constructs a trajectory-dependent inhomogeneous diffusion process by iteratively composing view-specific random walk operators and introduces a learnable space of diffusion operators; geometrically consistent embeddings are obtained by combining singular value decomposition with diffusion distance modeling. Theoretical analysis establishes ergodicity of the process, while experiments demonstrate that MDT significantly outperforms state-of-the-art methods on manifold learning and clustering tasks. Key contributions include: (i) the first formulation of multi-view interaction as an interleaved diffusion trajectory, yielding a unified framework with probabilistic interpretability, geometric coherence, and new degrees of freedom for view fusion; and (ii) a principled mechanism for operator learning and baseline evaluation grounded in internal quality measures.
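To make the interleaving idea concrete, the sketch below builds view-specific random walk operators from Gaussian kernels and composes them along a trajectory of view indices. This is a minimal illustration under assumed choices (Gaussian affinities, the bandwidth `sigma`, and the hypothetical helper names `random_walk_operator` and `mdt_operator`), not the paper's exact construction.

```python
import numpy as np

def random_walk_operator(X, sigma=1.0):
    """Row-stochastic random walk operator from a Gaussian affinity kernel
    (an assumed kernel choice; the paper's operator space may differ)."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / (2 * sigma ** 2))
    return W / W.sum(axis=1, keepdims=True)  # rows sum to 1

def mdt_operator(views, trajectory, sigma=1.0):
    """Compose view-specific operators along a trajectory of view indices,
    yielding a trajectory-dependent, inhomogeneous diffusion operator."""
    ops = [random_walk_operator(X, sigma) for X in views]
    P = np.eye(views[0].shape[0])
    for v in trajectory:   # e.g. [0, 1, 0, 1] interleaves two views
        P = P @ ops[v]     # each step diffuses through one view's geometry
    return P               # a product of row-stochastic matrices stays row-stochastic

# Example: two views of the same 50 points, alternated for four steps.
rng = np.random.default_rng(0)
view_a = rng.normal(size=(50, 3))
view_b = view_a @ rng.normal(size=(3, 3)) + 0.1 * rng.normal(size=(50, 3))
P = mdt_operator([view_a, view_b], trajectory=[0, 1, 0, 1])
```

Varying the trajectory (its length, ordering, and how often each view appears) is one way to realize the degrees of freedom for view fusion mentioned above.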
📝 Abstract
This paper introduces a unified framework for constructing multi-view diffusion geometries through intertwined multi-view diffusion trajectories (MDTs), a class of inhomogeneous diffusion processes that iteratively combine the random walk operators of multiple data views. Each MDT defines a trajectory-dependent diffusion operator with a clear probabilistic and geometric interpretation, capturing the interplay between data views over time. Our formulation encompasses existing multi-view diffusion models while providing new degrees of freedom for view interaction and fusion. We establish theoretical properties under mild assumptions, including ergodicity of both the point-wise operator and the process itself. We also derive MDT-based diffusion distances and associated embeddings via singular value decompositions. Finally, we propose various strategies for learning MDT operators within the defined operator space, guided by internal quality measures. Beyond enabling flexible model design, MDTs also offer a neutral baseline for evaluating diffusion-based approaches through comparison with randomly selected MDTs. Experiments show the practical impact of MDT operators in manifold learning and data clustering settings.
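As a rough, hedged illustration of the diffusion-distance and SVD-embedding step: assuming a row-stochastic, ergodic MDT operator `P`, one can follow standard diffusion-maps practice and embed points so that Euclidean distances approximate diffusion distances weighted by the stationary distribution. The helper names below (`stationary_distribution`, `diffusion_coordinates`) are hypothetical, and the paper's precise derivation may differ.

```python
import numpy as np

def stationary_distribution(P, iters=2000):
    """Stationary distribution of an ergodic row-stochastic P via power iteration
    (ergodicity, which the paper establishes under mild assumptions, gives convergence)."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        pi = pi @ P
    return pi / pi.sum()

def diffusion_coordinates(P, k=2):
    """SVD-based embedding whose Euclidean distances approximate the diffusion
    distances D(i, j)^2 = sum_m (P[i, m] - P[j, m])^2 / pi[m]."""
    pi = stationary_distribution(P)
    A = P / np.sqrt(pi)[None, :]            # row distances of A equal diffusion distances
    U, S, _ = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] * S[:k]                 # truncating to k coordinates approximates them
```

Since an MDT operator is a product of different view operators and hence generally non-symmetric, a singular value decomposition (rather than an eigendecomposition) is the natural tool, consistent with the abstract's mention of embeddings via singular value decompositions.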