🤖 AI Summary
Existing hash-center-based methods neglect inter-class semantic relationships because hash centers are randomly initialized; two-stage optimization alleviates this issue but introduces error accumulation and computational redundancy. This paper proposes an end-to-end joint learning framework that dynamically reassigns hash centers within a predefined codebook to adaptively model inter-class semantic structure while co-optimizing the neural hash function. The key contributions are: (1) a dynamic center matching mechanism that eliminates the need for an explicit center-optimization stage; (2) multi-head representation fusion that enhances semantic discriminability; and (3) a lightweight codebook-based architecture. Extensive experiments on three benchmark datasets demonstrate significant improvements over state-of-the-art methods, yielding semantically consistent hash codes with superior retrieval performance.
📝 Abstract
Hash center-based deep hashing methods improve upon pairwise or triplet-based approaches by assigning fixed hash centers to each class as learning targets, thereby avoiding the inefficiency of local similarity optimization. However, random center initialization often disregards inter-class semantic relationships. While existing two-stage methods mitigate this by first refining hash centers with semantics and then training the hash function, they introduce additional complexity, computational overhead, and suboptimal performance due to stage-wise discrepancies. To address these limitations, we propose **Center-Reassigned Hashing (CRH)**, an end-to-end framework that **dynamically reassigns hash centers** from a preset codebook while jointly optimizing the hash function. Unlike previous methods, CRH adapts hash centers to the data distribution **without explicit center optimization phases**, enabling seamless integration of semantic relationships into the learning process. Furthermore, **a multi-head mechanism** enhances the representational capacity of hash centers, capturing richer semantic structures. Extensive experiments on three benchmarks demonstrate that CRH learns semantically meaningful hash centers and outperforms state-of-the-art deep hashing methods in retrieval tasks.
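The core idea of reassigning centers from a preset codebook can be illustrated with a minimal sketch. This is not the paper's implementation; it only assumes a random ±1 codebook, per-class feature prototypes, and a one-to-one matching (here via the Hungarian algorithm from SciPy) that picks the codebook row most similar to each class prototype — in the paper this matching would be performed dynamically during end-to-end training rather than as a separate step.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def build_codebook(num_codes, bits, seed=0):
    # Preset candidate hash centers: random +/-1 codes.
    # (Real methods often use Hadamard or Bernoulli constructions.)
    rng = np.random.default_rng(seed)
    return rng.choice([-1.0, 1.0], size=(num_codes, bits))

def reassign_centers(class_protos, codebook):
    """Match each class to a distinct codebook row.

    class_protos: (num_classes, bits) per-class feature prototypes.
    codebook:     (num_codes, bits) candidate centers, num_codes >= num_classes.
    Returns the selected centers and their codebook indices.
    """
    # Similarity of every class prototype to every candidate center.
    sim = class_protos @ codebook.T
    # Hungarian matching maximizes total similarity under a
    # one-center-per-class constraint (negate sim to minimize cost).
    _, cols = linear_sum_assignment(-sim)
    return codebook[cols], cols

if __name__ == "__main__":
    codebook = build_codebook(num_codes=8, bits=16)
    rng = np.random.default_rng(1)
    protos = np.sign(rng.standard_normal((3, 16)))  # 3 toy class prototypes
    centers, idx = reassign_centers(protos, codebook)
    print(centers.shape, idx)
```

In a joint-training setting, the prototypes would come from the hash network's per-class embeddings, so the matching adapts as training progresses, which is the semantic-aware behavior CRH targets without a separate center-optimization phase.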