AI Summary
This work addresses a limitation of existing distributed multi-view image compression methods, which fail to adequately model the varying inter-view correlations during decoding, constraining performance. To overcome this, we propose ParaHydra, an end-to-end framework built on the OmniParallax attention mechanism, which adaptively captures disparity-based correlations between arbitrary view pairs. We further introduce a Parallax Multi-Information Fusion module to efficiently integrate multi-source information within both the decoder and the entropy model. Our approach is the first to significantly outperform state-of-the-art multi-view codecs under a distributed setting, achieving bitrate savings of 19.72% and 24.18% on the WildTrack(3) and WildTrack(6) datasets, respectively, while accelerating encoding and decoding by up to 34× and 65×.
Abstract
Multi-view image compression (MIC) aims to achieve high compression efficiency by exploiting inter-image correlations, and plays a crucial role in 3D applications. As a subfield of MIC, distributed multi-view image compression (DMIC) offers performance comparable to MIC while eliminating the need for inter-view information at the encoder side. However, existing DMIC methods typically treat all images equally, overlooking the varying degrees of correlation between different views during decoding, which leads to suboptimal coding performance. To address this limitation, we propose a novel $\textbf{OmniParallax Attention Mechanism}$ (OPAM), a general mechanism for explicitly modeling correlations and aligned features between arbitrary pairs of information sources. Building upon OPAM, we propose a Parallax Multi-Information Fusion Module (PMIFM) to adaptively integrate information from different sources. PMIFM is incorporated into both the joint decoder and the entropy model to construct our end-to-end DMIC framework, $\textbf{ParaHydra}$. Extensive experiments demonstrate that $\textbf{ParaHydra}$ is $\textbf{the first DMIC method}$ to significantly surpass state-of-the-art MIC codecs, while maintaining low computational overhead. Performance gains become more pronounced as the number of input views increases. Compared with LDMIC, $\textbf{ParaHydra}$ achieves bitrate savings of $\textbf{19.72\%}$ on WildTrack(3) and up to $\textbf{24.18\%}$ on WildTrack(6), while significantly improving coding efficiency (as much as $\textbf{65}\times$ in decoding and $\textbf{34}\times$ in encoding).
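The abstract does not give OPAM's equations, but its core idea, modeling correlations between an arbitrary pair of information sources and producing aligned features, resembles cross-attention between two views. The sketch below is only an illustration of that general pattern, not the paper's actual design: all names, shapes, and the single-head NumPy formulation are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_view_attention(q_view, kv_view, Wq, Wk, Wv):
    """Toy single-head cross-attention: tokens of the view being
    decoded (q_view) attend to a reference view (kv_view)."""
    Q, K, V = q_view @ Wq, kv_view @ Wk, kv_view @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])   # pairwise token correlations
    A = softmax(scores, axis=-1)              # soft, disparity-style matching
    return A @ V                              # reference features aligned to q_view

# Hypothetical sizes for demonstration only.
rng = np.random.default_rng(0)
C, d, Nq, Nk = 8, 16, 5, 7                    # channels, head dim, token counts
Wq, Wk, Wv = (rng.standard_normal((C, d)) for _ in range(3))
aligned = cross_view_attention(rng.standard_normal((Nq, C)),
                               rng.standard_normal((Nk, C)), Wq, Wk, Wv)
print(aligned.shape)  # (5, 16)
```

Because the query and key/value sources are separate inputs, the same block can, in principle, be instantiated for any ordered pair of views, which is the property the abstract attributes to OPAM.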