🤖 AI Summary
Hubness, the tendency of a few target instances to become nearest neighbors of many queries, distorts similarity measurements in cross-modal retrieval. To address it, we propose Dual-Bank Sinkhorn Normalization (DBSN), the first method to jointly balance probability distributions over both query and target banks when the true query distribution is unknown, achieving bidirectional probabilistic balance. DBSN runs parallel Sinkhorn iterations over a query bank and a corresponding target bank to calibrate cross-modal matching distributions, effectively mitigating hubness. Extensive experiments show consistent improvements across image–text, video–text, and audio–text retrieval benchmarks, validating both its effectiveness and its cross-modal generalizability. Our core contribution is a bidirectional distribution-alignment mechanism for the unknown-query-distribution setting, establishing a new paradigm for hubness mitigation in cross-modal retrieval.
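The calibration step described above can be illustrated with a minimal NumPy sketch. This is a hypothetical single-matrix illustration of Sinkhorn-style normalization, not the authors' DBSN implementation (which iterates over separate query and target banks); the temperature `tau` and iteration count are assumed values for demonstration.

```python
import numpy as np

def sinkhorn_normalize(sim, tau=0.05, n_iters=10):
    """Illustrative Sinkhorn normalization of a query-by-target similarity
    matrix. Alternately rescales rows and columns of exp(sim / tau) so that
    both query and target marginals are balanced, limiting how much
    probability mass a single hub target can attract."""
    K = np.exp(sim / tau)                   # positive kernel from raw similarities
    for _ in range(n_iters):
        K /= K.sum(axis=1, keepdims=True)   # balance query (row) marginals
        K /= K.sum(axis=0, keepdims=True)   # balance target (column) marginals
    return K

# Toy usage: rank targets for each query by the calibrated scores.
rng = np.random.default_rng(0)
sim = rng.normal(size=(4, 6))               # 4 queries, 6 targets
P = sinkhorn_normalize(sim)
ranking = np.argsort(-P, axis=1)            # per-query target ranking
```

Because the final step normalizes columns, no target can dominate the score mass across all queries, which is the intuition behind using Sinkhorn iterations for hubness reduction.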
📝 Abstract
The past decade has witnessed rapid advancements in cross-modal retrieval, with significant progress made in accurately measuring the similarity between cross-modal pairs. However, the persistent hubness problem, a phenomenon where a small number of targets frequently appear as nearest neighbors to numerous queries, continues to hinder the precision of similarity measurements. Despite several proposed methods to reduce hubness, their underlying mechanisms remain poorly understood. To bridge this gap, we analyze the widely adopted Inverted Softmax approach and demonstrate its effectiveness in balancing target probabilities during retrieval. Building on these insights, we propose a probability-balancing framework for more effective hubness reduction. We contend that balancing target probabilities alone is inadequate and, therefore, extend the framework to balance both query and target probabilities by introducing Sinkhorn Normalization (SN). Notably, we extend SN to scenarios where the true query distribution is unknown, showing that current methods, which rely solely on a query bank to estimate target hubness, produce suboptimal results due to a significant distributional gap between the query bank and targets. To mitigate this issue, we introduce Dual-Bank Sinkhorn Normalization (DBSN), incorporating a corresponding target bank alongside the query bank to narrow this distributional gap. Our comprehensive evaluation across various cross-modal retrieval tasks, including image-text retrieval, video-text retrieval, and audio-text retrieval, demonstrates consistent performance improvements, validating the effectiveness of both SN and DBSN. All code is publicly available at https://github.com/ppanzx/DBSN.
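The Inverted Softmax baseline analyzed in the abstract can be sketched in a few lines. This is an illustrative NumPy version under assumed settings (temperature `tau` is a placeholder), not the paper's implementation: the key point is that scores are normalized over the query axis rather than the target axis, so a hub target close to many queries has its probability mass spread thin.

```python
import numpy as np

def inverted_softmax(sim, tau=0.05):
    """Illustrative Inverted Softmax scoring. For a query-by-target
    similarity matrix, normalize exp(sim / tau) over the query axis:
    each target's total probability across queries sums to one, which
    down-weights hub targets that are near many queries at once."""
    K = np.exp(sim / tau)
    return K / K.sum(axis=0, keepdims=True)  # normalize over queries, per target

# Toy usage: retrieve the best target per query from calibrated scores.
rng = np.random.default_rng(1)
sim = rng.normal(size=(5, 8))                # 5 queries, 8 targets
scores = inverted_softmax(sim)
best_target = scores.argmax(axis=1)          # one retrieved target per query
```

This balances only the target-side probabilities; the abstract's argument is that query-side probabilities should be balanced as well, which motivates the Sinkhorn-based extensions (SN and DBSN).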