🤖 AI Summary
This work addresses the challenge of maintaining geometric consistency across novel views in generative models, particularly diffusion models, whose stochastic generation process often leads to inconsistencies. To overcome this limitation, the authors propose a deterministic data-to-data flow matching framework that, for the first time, incorporates probability density-induced geodesic interpolation into flow matching. By leveraging high-density regions of a pretrained diffusion model to guide interpolation trajectories between views and integrating data coupling constraints to reinforce structural coherence, the method achieves significantly improved performance over diffusion-based baselines. Experimental results demonstrate notable gains in both geometric consistency and smoothness of view transitions in novel view synthesis tasks.
📄 Abstract
Recent advances in generative modeling have substantially enhanced novel view synthesis, yet maintaining consistency across viewpoints remains challenging. Diffusion-based models rely on stochastic noise-to-data transitions, which obscure deterministic structures and yield inconsistent view predictions. We propose a Data-to-Data Flow Matching framework that learns deterministic transformations directly between paired views, enhancing view-consistent synthesis through explicit data coupling. To further enhance geometric coherence, we introduce Probability Density Geodesic Flow Matching (PDG-FM), which constrains flow trajectories using geodesic interpolants derived from probability density metrics of pretrained diffusion models. Such alignment with high-density regions of the data manifold promotes more realistic interpolants between samples. Empirically, our method surpasses diffusion-based NVS baselines, demonstrating improved structural coherence and smoother transitions across views. These results highlight the advantages of incorporating data-dependent geometric regularization into deterministic flow matching for consistent novel view generation.
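To make the core idea concrete, the following is a minimal sketch of the data-to-data flow matching objective: sample a time along an interpolant between paired views and regress the velocity field along that path. Note that PDG-FM uses geodesic interpolants induced by a pretrained diffusion model's density; the straight-line interpolant below is an illustrative stand-in, not the paper's actual interpolant, and all function and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def flow_matching_loss(predict_velocity, x0, x1, rng):
    """Regress a velocity field along interpolants between paired samples.

    x0, x1: paired data (e.g., features of two views), shape (batch, dim).
    predict_velocity: callable (x_t, t) -> predicted velocity.
    """
    t = rng.uniform(size=(x0.shape[0], 1))   # random time in [0, 1]
    x_t = (1 - t) * x0 + t * x1              # straight-line interpolant
                                             # (PDG-FM would use a
                                             # density-induced geodesic here)
    v_target = x1 - x0                       # constant velocity of this path
    v_pred = predict_velocity(x_t, t)
    return np.mean((v_pred - v_target) ** 2)

x0 = rng.normal(size=(16, 2))                # toy "source view" samples
x1 = rng.normal(size=(16, 2))                # paired "target view" samples

# A trivial untrained predictor (always zero velocity), just to run the loss:
loss = flow_matching_loss(lambda x_t, t: np.zeros_like(x_t), x0, x1, rng)
```

The key contrast with noise-to-data diffusion training is that both endpoints of the interpolant are real data (a coupled pair of views), so the learned map between views is deterministic rather than driven by sampled noise.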