Hubness Reduction with Dual Bank Sinkhorn Normalization for Cross-Modal Retrieval

📅 2025-08-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
Hubness—the tendency of a few target instances to become nearest neighbors for many queries—distorts similarity measurements in cross-modal retrieval. To address this, we propose Dual Bank Sinkhorn Normalization (DBSN), the first method to achieve bidirectional probabilistic balance by jointly calibrating probability distributions over both a query bank and a target bank when the true query distribution is unknown. DBSN runs parallel Sinkhorn iterations over the two banks to calibrate cross-modal matching distributions, effectively mitigating hubness. Extensive experiments demonstrate consistent improvements across image–text, video–text, and audio–text retrieval benchmarks, validating its effectiveness and cross-modal generalizability.

📝 Abstract
The past decade has witnessed rapid advancements in cross-modal retrieval, with significant progress made in accurately measuring the similarity between cross-modal pairs. However, the persistent hubness problem, a phenomenon where a small number of targets frequently appear as nearest neighbors to numerous queries, continues to hinder the precision of similarity measurements. Despite several proposed methods to reduce hubness, their underlying mechanisms remain poorly understood. To bridge this gap, we analyze the widely-adopted Inverted Softmax approach and demonstrate its effectiveness in balancing target probabilities during retrieval. Building on these insights, we propose a probability-balancing framework for more effective hubness reduction. We contend that balancing target probabilities alone is inadequate and, therefore, extend the framework to balance both query and target probabilities by introducing Sinkhorn Normalization (SN). Notably, we extend SN to scenarios where the true query distribution is unknown, showing that current methods, which rely solely on a query bank to estimate target hubness, produce suboptimal results due to a significant distributional gap between the query bank and targets. To mitigate this issue, we introduce Dual Bank Sinkhorn Normalization (DBSN), incorporating a corresponding target bank alongside the query bank to narrow this distributional gap. Our comprehensive evaluation across various cross-modal retrieval tasks, including image-text retrieval, video-text retrieval, and audio-text retrieval, demonstrates consistent performance improvements, validating the effectiveness of both SN and DBSN. All code is publicly available at https://github.com/ppanzx/DBSN.
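The probability-balancing idea behind Sinkhorn Normalization can be pictured as alternating row and column normalization of the query–target similarity matrix, so that no single target absorbs most of the probability mass. A minimal NumPy sketch of this balancing step (the temperature, iteration count, and toy data are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def sinkhorn_normalize(sim, n_iters=20, temperature=0.05):
    """Alternately normalize rows (query probabilities) and columns
    (target probabilities) of a similarity matrix.  A sketch of the
    Sinkhorn Normalization idea, not the paper's exact procedure."""
    P = np.exp(sim / temperature)  # softmax-style positive kernel
    for _ in range(n_iters):
        P /= P.sum(axis=1, keepdims=True)  # balance each query's mass
        P /= P.sum(axis=0, keepdims=True)  # balance each target's mass
    return P

# Toy example: 4 queries vs 3 targets, with target 0 made a "hub"
# that every query would otherwise retrieve first.
rng = np.random.default_rng(0)
sim = rng.normal(size=(4, 3))
sim[:, 0] += 2.0
P = sinkhorn_normalize(sim)
```

After balancing, each target column carries equal probability mass, which is exactly the property that suppresses hub targets.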
Problem

Research questions and friction points this paper is trying to address.

Addresses the hubness problem in cross-modal retrieval
Proposes Dual Bank Sinkhorn Normalization for more effective hubness reduction
Improves similarity-measurement accuracy across diverse retrieval tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Probability-balancing framework reduces hubness effectively
Sinkhorn Normalization balances query and target probabilities
Dual Bank Sinkhorn Normalization narrows distributional gap
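One plausible reading of the dual-bank contribution: at test time the true query distribution is unknown, so each target's hubness is estimated from the similarities of both a query bank and a target bank, and test-time scores are then discounted by that estimate, Inverted-Softmax style. The function names and the exact combination rule below are illustrative assumptions, not the paper's published algorithm:

```python
import numpy as np

def dbsn_scores(sim_q2t, bank_q2t, bank_t2t, temperature=0.05):
    """Hedged sketch of the dual-bank idea: estimate per-target hubness
    from a combined query bank + target bank, then discount test-time
    similarities by it (Inverted-Softmax style)."""
    # Stack bank-to-target similarities from both banks to narrow the
    # distributional gap between the query bank and the targets.
    bank = np.vstack([bank_q2t, bank_t2t])
    bank_probs = np.exp(bank / temperature)
    bank_probs /= bank_probs.sum(axis=1, keepdims=True)
    target_mass = bank_probs.sum(axis=0)  # hub targets accumulate mass
    # Discount test-time similarities by the estimated target mass.
    probs = np.exp(sim_q2t / temperature) / target_mass
    return probs / probs.sum(axis=1, keepdims=True)

# Toy usage: 5 test queries, 6 targets, banks of 8 queries / 6 targets.
rng = np.random.default_rng(1)
scores = dbsn_scores(rng.normal(size=(5, 6)),
                     rng.normal(size=(8, 6)),
                     rng.normal(size=(6, 6)))
```

The design choice this sketch highlights is that the target bank supplies similarity statistics the query bank alone cannot, which is the distributional-gap argument made in the abstract.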
Zhengxin Pan
Zhejiang Key Lab of Accessible Perception & Intelligent Systems, Zhejiang University, Hangzhou, China
Haishuai Wang
Harvard University
Fangyu Wu
School of Advanced Technology, Xi’an Jiaotong-Liverpool University, Suzhou, China
Peng Zhang
Cyberspace Institute of Advanced Technology, Guangzhou University, Guangzhou, China
Jiajun Bu
Zhejiang Key Lab of Accessible Perception & Intelligent Systems, Zhejiang University, China